[Git][ghc/ghc][wip/inplace-22209] 14 commits: Factorize hptModulesBelow

Matthew Pickering (@mpickering) gitlab at gitlab.haskell.org
Fri Jan 27 10:14:48 UTC 2023



Matthew Pickering pushed to branch wip/inplace-22209 at Glasgow Haskell Compiler / GHC


Commits:
1bd32a35 by Sylvain Henry at 2023-01-26T12:34:21-05:00
Factorize hptModulesBelow

Create and use moduleGraphModulesBelow in GHC.Unit.Module.Graph, which
doesn't need anything from the driver in order to be used.
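
A hedged sketch of what this enables (the module paths and re-exports used
here are my assumption; the real patch simply deletes the driver-level helper
and switches its call sites over, as the GHC.Driver.Env hunk below shows):

```haskell
-- Illustration only: the traversal now lives in GHC.Unit.Module.Graph, so a
-- driver-level query reduces to a one-liner over the module graph.
import GHC.Driver.Env        (HscEnv, hsc_mod_graph)
import GHC.Unit.Module       (ModuleNameWithIsBoot, UnitId)
import GHC.Unit.Module.Graph (ModNodeKeyWithUid, moduleGraphModulesBelow)
import Data.Set              (Set)

hptModulesBelow' :: HscEnv -> UnitId -> ModuleNameWithIsBoot -> Set ModNodeKeyWithUid
hptModulesBelow' hsc_env uid mn =
  moduleGraphModulesBelow (hsc_mod_graph hsc_env) uid mn
```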

- - - - -
1262d3f8 by Matthew Pickering at 2023-01-26T12:34:56-05:00
Store dehydrated data structures in CgModBreaks

This fixes a tricky leak in GHCi where we were retaining old copies of
HscEnvs when reloading. If not all modules were recompiled, the hydrated
fields in breakpoints would retain a reference to the old HscEnv, which
could double memory usage.

Fixes #22530
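
A minimal, self-contained sketch of the underlying idea, with hypothetical
types (`BigEnv` standing in for `HscEnv`); the real dehydrate/rehydrate pair
is `dehydrateCgBreakInfo` / `hydrateCgBreakInfo` in the diff below:

```haskell
{-# LANGUAGE BangPatterns #-}

-- Hypothetical stand-in for HscEnv.
data BigEnv = BigEnv { envBlob :: [Int] }

-- Leaky shape: the second component is left as a thunk that closes over the
-- whole BigEnv, so storing the pair keeps the old environment alive.
leakyEntry :: BigEnv -> String -> (String, Int)
leakyEntry env name = (name, length (envBlob env))

-- "Dehydrated" shape: force a small, environment-free value up front;
-- anything that really needs an environment is rebuilt later from the
-- *current* one rather than captured from the old one.
dehydratedEntry :: BigEnv -> String -> (String, Int)
dehydratedEntry env name = let !n = length (envBlob env) in (name, n)
```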

- - - - -
e27eb80c by Matthew Pickering at 2023-01-26T12:34:56-05:00
Force more in NFData Name instance

Doesn't force the lazy `OccName` field (#19619), which is already known
to be a really bad source of leaks.

When we slam the hammer and store Names on disk (in interface files or the
like), all of this should be forced, as otherwise a `Name` can easily retain
an `Id` and hence the entire world.

Fixes #22833
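
A generic sketch of the idiom with a hypothetical record (not GHC's `Name`):
a complete `NFData` instance forces every field, which is what the patched
`instance NFData Name` in the diff below now does.

```haskell
{-# LANGUAGE RecordWildCards #-}

import Control.DeepSeq (NFData (..))

-- Hypothetical record; tOcc plays the role of the lazy field that an
-- incomplete instance (rnf t = rnf (tSort t)) would never reach.
data Thing = Thing { tSort :: Int, tOcc :: String, tUniq :: Word, tLoc :: (Int, Int) }

-- Force every field, in the style of the patched instance
-- (rnf n_sort `seq` rnf n_occ `seq` n_uniq `seq` rnf n_loc), so that
-- forcing a Thing leaves behind no thunk that could retain something large.
instance NFData Thing where
  rnf Thing{..} = rnf tSort `seq` rnf tOcc `seq` tUniq `seq` rnf tLoc
```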

- - - - -
3d004d5a by Matthew Pickering at 2023-01-26T12:34:56-05:00
Force OccName in tidyTopName

This OccName has just been derived from an `Id`, so we need to force it
promptly so that we can release the `Id` back to the world.

Another symptom of the bug tracked in #19619

- - - - -
f2a0fea0 by Matthew Pickering at 2023-01-26T12:34:56-05:00
Strict fields in ModNodeKey (otherwise retains HomeModInfo)

Towards #22530

- - - - -
5640cb1d by Sylvain Henry at 2023-01-26T12:35:36-05:00
Hadrian: fix doc generation

The documentation rules were missing dependencies on files generated from
templates (e.g. ghc.cabal).

- - - - -
3e827c3f by Richard Eisenberg at 2023-01-26T20:06:53-05:00
Do newtype unwrapping in the canonicaliser and rewriter

See Note [Unwrap newtypes first], which has the details.

Close #22519.

- - - - -
b3ef5c89 by doyougnu at 2023-01-26T20:07:48-05:00
tryFillBuffer: strictify

more speculative bangs

- - - - -
d0d7ba0f by Vladislav Zavialov at 2023-01-26T20:08:25-05:00
base: NoImplicitPrelude in Data.Void and Data.Kind

This change removes an unnecessary dependency on Prelude
from two modules in the base package.

- - - - -
fa1db923 by Matthew Pickering at 2023-01-26T20:09:00-05:00
ci: Add ubuntu18_04 nightly and release jobs

This adds nightly and release jobs for ubuntu18_04, which uses glibc 2.27,
older than the 2.28 used by the Rocky8 bindists.

Ticket #22268

- - - - -
807310a1 by Matthew Pickering at 2023-01-26T20:09:00-05:00
rel-eng: Add missing rocky8 bindist

We intend to release rocky8 bindists, so the fetching script needs to know
about them.

- - - - -
c7116b10 by Ben Gamari at 2023-01-26T20:09:35-05:00
base: Make changelog proposal references more consistent

Addresses #22773.

- - - - -
6932cfc7 by Sylvain Henry at 2023-01-26T20:10:27-05:00
Fix spurious change from !9568

- - - - -
e2c5e820 by Matthew Pickering at 2023-01-27T10:14:26+00:00
hadrian: Fix library-dirs, dynamic-library-dirs and static-library-dirs in inplace .conf files

Previously we were just throwing away the contents of the library-dirs
fields, but really we have to do the same thing as for include-dirs:
relativise the paths into the current working directory and preserve any
extra library directories the user has specified.

Now the relevant section of the rts.conf file looks like:

```
library-dirs:
    ${pkgroot}/../rts/build
    ${pkgroot}/../../..//_build/stage1/rts/build
    /nix/store/av4c0fib4rkmb6sa1074z0rb1ciria5b-gperftools-2.10/lib
    /nix/store/2infxahfp9lj084xn3q9ib5ajks8447i-libffi-3.4.4/lib

library-dirs-static:
    ${pkgroot}/../rts/build
    ${pkgroot}/../../..//_build/stage1/rts/build
    /nix/store/av4c0fib4rkmb6sa1074z0rb1ciria5b-gperftools-2.10/lib
    /nix/store/2infxahfp9lj084xn3q9ib5ajks8447i-libffi-3.4.4/lib

dynamic-library-dirs:
    ${pkgroot}/../rts/build
    ${pkgroot}/../../..//_build/stage1/rts/build
    /nix/store/av4c0fib4rkmb6sa1074z0rb1ciria5b-gperftools-2.10/lib
    /nix/store/2infxahfp9lj084xn3q9ib5ajks8447i-libffi-3.4.4/lib
```

Fixes #22209
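
Roughly, the fix treats the library-dirs fields the same way include-dirs are
already treated. A standalone sketch of that relativisation with a
hypothetical helper (the real change reuses Hadrian's existing
fixupIncludeDir; see the Hadrian.Haskell.Cabal.Parse hunk below):

```haskell
import Data.List (stripPrefix)

-- Hypothetical sketch: a directory inside the build tree is rewritten to be
-- ${pkgroot}-relative, while user-specified directories (e.g. the
-- /nix/store/... entries above) are kept as-is.
relativiseLibDir :: FilePath  -- ^ absolute build root
                 -> String    -- ^ prefix to splice in, e.g. "${pkgroot}/../../.."
                 -> FilePath -> FilePath
relativiseLibDir buildRoot pkgrootPrefix dir =
  case stripPrefix buildRoot dir of
    Just rest -> pkgrootPrefix ++ rest
    Nothing   -> dir
```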

- - - - -


29 changed files:

- .gitlab-ci.yml
- .gitlab/gen_ci.hs
- .gitlab/jobs.yaml
- .gitlab/rel_eng/fetch-gitlab-artifacts/fetch_gitlab.py
- compiler/GHC/ByteCode/Types.hs
- compiler/GHC/Core/Coercion.hs
- compiler/GHC/CoreToIface.hs
- compiler/GHC/Driver/Env.hs
- compiler/GHC/Iface/Tidy.hs
- compiler/GHC/IfaceToCore.hs
- compiler/GHC/Runtime/Eval.hs
- compiler/GHC/StgToByteCode.hs
- compiler/GHC/Tc/Instance/Family.hs
- compiler/GHC/Tc/Solver/Canonical.hs
- compiler/GHC/Tc/Solver/Monad.hs
- compiler/GHC/Tc/Solver/Rewrite.hs
- compiler/GHC/Tc/Types.hs
- compiler/GHC/Types/Name.hs
- compiler/GHC/Unit/Module/Graph.hs
- hadrian/src/Hadrian/Haskell/Cabal/Parse.hs
- hadrian/src/Rules/Documentation.hs
- hadrian/src/Rules/Generate.hs
- hadrian/src/Settings/Builders/RunTest.hs
- libraries/base/Data/Kind.hs
- libraries/base/Data/Void.hs
- libraries/base/GHC/Foreign.hs
- libraries/base/changelog.md
- + testsuite/tests/typecheck/should_compile/T22519.hs
- testsuite/tests/typecheck/should_compile/all.T


Changes:

=====================================
.gitlab-ci.yml
=====================================
@@ -2,7 +2,7 @@ variables:
   GIT_SSL_NO_VERIFY: "1"
 
   # Commit of ghc/ci-images repository from which to pull Docker images
-  DOCKER_REV: dd01591a50ea4e2aa3c106cf50ca54d38663f912
+  DOCKER_REV: 2d59d551647d102c4af44f257c520a94f04ea3f6
 
   # Sequential version number of all cached things.
   # Bump to invalidate GitLab CI cache.


=====================================
.gitlab/gen_ci.hs
=====================================
@@ -112,6 +112,7 @@ data LinuxDistro
   = Debian11 | Debian10 | Debian9
   | Fedora33
   | Ubuntu2004
+  | Ubuntu1804
   | Centos7
   | Alpine
   | Rocky8
@@ -270,6 +271,7 @@ distroName Debian11  = "deb11"
 distroName Debian10   = "deb10"
 distroName Debian9   = "deb9"
 distroName Fedora33  = "fedora33"
+distroName Ubuntu1804 = "ubuntu18_04"
 distroName Ubuntu2004 = "ubuntu20_04"
 distroName Centos7    = "centos7"
 distroName Alpine     = "alpine3_12"
@@ -860,6 +862,7 @@ job_groups =
      -- We still build Deb9 bindists for now due to Ubuntu 18 and Linux Mint 19
      -- not being at EOL until April 2023 and they still need tinfo5.
      , disableValidate (standardBuildsWithConfig Amd64 (Linux Debian9) (splitSectionsBroken vanilla))
+     , disableValidate (standardBuilds Amd64 (Linux Ubuntu1804))
      , disableValidate (standardBuilds Amd64 (Linux Ubuntu2004))
      , disableValidate (standardBuilds Amd64 (Linux Rocky8))
      , disableValidate (standardBuildsWithConfig Amd64 (Linux Centos7) (splitSectionsBroken vanilla))


=====================================
.gitlab/jobs.yaml
=====================================
@@ -1872,6 +1872,65 @@
       "XZ_OPT": "-9"
     }
   },
+  "nightly-x86_64-linux-ubuntu18_04-validate": {
+    "after_script": [
+      ".gitlab/ci.sh save_cache",
+      ".gitlab/ci.sh clean",
+      "cat ci_timings"
+    ],
+    "allow_failure": false,
+    "artifacts": {
+      "expire_in": "8 weeks",
+      "paths": [
+        "ghc-x86_64-linux-ubuntu18_04-validate.tar.xz",
+        "junit.xml"
+      ],
+      "reports": {
+        "junit": "junit.xml"
+      },
+      "when": "always"
+    },
+    "cache": {
+      "key": "x86_64-linux-ubuntu18_04-$CACHE_REV",
+      "paths": [
+        "cabal-cache",
+        "toolchain"
+      ]
+    },
+    "dependencies": [],
+    "image": "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-ubuntu18_04:$DOCKER_REV",
+    "needs": [
+      {
+        "artifacts": false,
+        "job": "hadrian-ghc-in-ghci"
+      }
+    ],
+    "rules": [
+      {
+        "if": "($CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+        "when": "on_success"
+      }
+    ],
+    "script": [
+      "sudo chown ghc:ghc -R .",
+      ".gitlab/ci.sh setup",
+      ".gitlab/ci.sh configure",
+      ".gitlab/ci.sh build_hadrian",
+      ".gitlab/ci.sh test_hadrian"
+    ],
+    "stage": "full-build",
+    "tags": [
+      "x86_64-linux"
+    ],
+    "variables": {
+      "BIGNUM_BACKEND": "gmp",
+      "BIN_DIST_NAME": "ghc-x86_64-linux-ubuntu18_04-validate",
+      "BUILD_FLAVOUR": "validate",
+      "CONFIGURE_ARGS": "",
+      "TEST_ENV": "x86_64-linux-ubuntu18_04-validate",
+      "XZ_OPT": "-9"
+    }
+  },
   "nightly-x86_64-linux-ubuntu20_04-validate": {
     "after_script": [
       ".gitlab/ci.sh save_cache",
@@ -3038,6 +3097,66 @@
       "XZ_OPT": "-9"
     }
   },
+  "release-x86_64-linux-ubuntu18_04-release": {
+    "after_script": [
+      ".gitlab/ci.sh save_cache",
+      ".gitlab/ci.sh clean",
+      "cat ci_timings"
+    ],
+    "allow_failure": false,
+    "artifacts": {
+      "expire_in": "1 year",
+      "paths": [
+        "ghc-x86_64-linux-ubuntu18_04-release.tar.xz",
+        "junit.xml"
+      ],
+      "reports": {
+        "junit": "junit.xml"
+      },
+      "when": "always"
+    },
+    "cache": {
+      "key": "x86_64-linux-ubuntu18_04-$CACHE_REV",
+      "paths": [
+        "cabal-cache",
+        "toolchain"
+      ]
+    },
+    "dependencies": [],
+    "image": "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-ubuntu18_04:$DOCKER_REV",
+    "needs": [
+      {
+        "artifacts": false,
+        "job": "hadrian-ghc-in-ghci"
+      }
+    ],
+    "rules": [
+      {
+        "if": "($CI_MERGE_REQUEST_LABELS !~ /.*fast-ci.*/) && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+        "when": "on_success"
+      }
+    ],
+    "script": [
+      "sudo chown ghc:ghc -R .",
+      ".gitlab/ci.sh setup",
+      ".gitlab/ci.sh configure",
+      ".gitlab/ci.sh build_hadrian",
+      ".gitlab/ci.sh test_hadrian"
+    ],
+    "stage": "full-build",
+    "tags": [
+      "x86_64-linux"
+    ],
+    "variables": {
+      "BIGNUM_BACKEND": "gmp",
+      "BIN_DIST_NAME": "ghc-x86_64-linux-ubuntu18_04-release",
+      "BUILD_FLAVOUR": "release",
+      "CONFIGURE_ARGS": "",
+      "IGNORE_PERF_FAILURES": "all",
+      "TEST_ENV": "x86_64-linux-ubuntu18_04-release",
+      "XZ_OPT": "-9"
+    }
+  },
   "release-x86_64-linux-ubuntu20_04-release": {
     "after_script": [
       ".gitlab/ci.sh save_cache",


=====================================
.gitlab/rel_eng/fetch-gitlab-artifacts/fetch_gitlab.py
=====================================
@@ -16,7 +16,9 @@ def job_triple(job_name):
     bindists = {
         'release-x86_64-windows-release': 'x86_64-unknown-mingw32',
         'release-x86_64-windows-int_native-release': 'x86_64-unknown-mingw32-int_native',
+        'release-x86_64-rocky8-release': 'x86_64-rocky8-linux',
         'release-x86_64-ubuntu20_04-release': 'x86_64-ubuntu20_04-linux',
+        'release-x86_64-ubuntu18_04-release': 'x86_64-ubuntu18_04-linux',
         'release-x86_64-linux-fedora33-release+debug_info': 'x86_64-fedora33-linux-dwarf',
         'release-x86_64-linux-fedora33-release': 'x86_64-fedora33-linux',
         'release-x86_64-linux-fedora27-release': 'x86_64-fedora27-linux',


=====================================
compiler/GHC/ByteCode/Types.hs
=====================================
@@ -23,12 +23,10 @@ import GHC.Prelude
 
 import GHC.Data.FastString
 import GHC.Data.SizedSeq
-import GHC.Types.Id
 import GHC.Types.Name
 import GHC.Types.Name.Env
 import GHC.Utils.Outputable
 import GHC.Builtin.PrimOps
-import GHC.Core.Type
 import GHC.Types.SrcLoc
 import GHCi.BreakArray
 import GHCi.RemoteTypes
@@ -41,10 +39,10 @@ import Data.Array.Base  ( UArray(..) )
 import Data.ByteString (ByteString)
 import Data.IntMap (IntMap)
 import qualified Data.IntMap as IntMap
-import Data.Maybe (catMaybes)
 import qualified GHC.Exts.Heap as Heap
 import GHC.Stack.CCS
 import GHC.Cmm.Expr ( GlobalRegSet, emptyRegSet, regSetToList )
+import GHC.Iface.Syntax
 
 -- -----------------------------------------------------------------------------
 -- Compiled Byte Code
@@ -174,18 +172,22 @@ instance NFData BCONPtr where
   rnf x = x `seq` ()
 
 -- | Information about a breakpoint that we know at code-generation time
+-- In order to be used, this needs to be hydrated relative to the current HscEnv by
+-- 'hydrateCgBreakInfo'. Everything here can be fully forced and that's critical for
+-- preventing space leaks (see #22530)
 data CgBreakInfo
    = CgBreakInfo
-   { cgb_vars   :: [Maybe (Id,Word16)]
-   , cgb_resty  :: Type
+   { cgb_tyvars :: ![IfaceTvBndr] -- ^ Type variables in scope at the breakpoint
+   , cgb_vars   :: ![Maybe (IfaceIdBndr, Word16)]
+   , cgb_resty  :: !IfaceType
    }
 -- See Note [Syncing breakpoint info] in GHC.Runtime.Eval
 
--- Not a real NFData instance because we can't rnf Id or Type
 seqCgBreakInfo :: CgBreakInfo -> ()
 seqCgBreakInfo CgBreakInfo{..} =
-  rnf (map snd (catMaybes (cgb_vars))) `seq`
-  seqType cgb_resty
+    rnf cgb_tyvars `seq`
+    rnf cgb_vars `seq`
+    rnf cgb_resty
 
 instance Outputable UnlinkedBCO where
    ppr (UnlinkedBCO nm _arity _insns _bitmap lits ptrs)


=====================================
compiler/GHC/Core/Coercion.hs
=====================================
@@ -1791,7 +1791,7 @@ topNormaliseNewType_maybe :: Type -> Maybe (Coercion, Type)
 --
 -- > topNormaliseNewType_maybe rec_nts ty = Just (co, ty')
 --
--- then (a)  @co : ty ~ ty'@.
+-- then (a)  @co : ty ~R ty'@.
 --      (b)  ty' is not a newtype.
 --
 -- The function returns @Nothing@ for non-@newtypes@,


=====================================
compiler/GHC/CoreToIface.hs
=====================================
@@ -43,12 +43,18 @@ module GHC.CoreToIface
     , toIfaceVar
       -- * Other stuff
     , toIfaceLFInfo
+      -- * CgBreakInfo
+    , dehydrateCgBreakInfo
     ) where
 
 import GHC.Prelude
 
+import Data.Word
+
 import GHC.StgToCmm.Types
 
+import GHC.ByteCode.Types
+
 import GHC.Core
 import GHC.Core.TyCon hiding ( pprPromotionQuote )
 import GHC.Core.Coercion.Axiom
@@ -685,6 +691,16 @@ toIfaceLFInfo nm lfi = case lfi of
     LFLetNoEscape ->
       panic "toIfaceLFInfo: LFLetNoEscape"
 
+-- Dehydrating CgBreakInfo
+
+dehydrateCgBreakInfo :: [TyVar] -> [Maybe (Id, Word16)] -> Type -> CgBreakInfo
+dehydrateCgBreakInfo ty_vars idOffSets tick_ty =
+          CgBreakInfo
+            { cgb_tyvars = map toIfaceTvBndr ty_vars
+            , cgb_vars = map (fmap (\(i, offset) -> (toIfaceIdBndr i, offset))) idOffSets
+            , cgb_resty = toIfaceType tick_ty
+            }
+
 {- Note [Inlining and hs-boot files]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Consider this example (#10083, #12789):


=====================================
compiler/GHC/Driver/Env.hs
=====================================
@@ -1,4 +1,3 @@
-{-# LANGUAGE LambdaCase #-}
 
 module GHC.Driver.Env
    ( Hsc(..)
@@ -84,10 +83,7 @@ import GHC.Utils.Logger
 
 import Data.IORef
 import qualified Data.Set as Set
-import Data.Set (Set)
 import GHC.Unit.Module.Graph
-import Data.List (sort)
-import qualified Data.Map as Map
 
 runHsc :: HscEnv -> Hsc a -> IO a
 runHsc hsc_env (Hsc hsc) = do
@@ -267,35 +263,6 @@ hptAllThings :: (HomeModInfo -> [a]) -> HscEnv -> [a]
 hptAllThings extract hsc_env = concatMap (concatMap extract . eltsHpt . homeUnitEnv_hpt . snd)
                                 (hugElts (hsc_HUG hsc_env))
 
--- | This function returns all the modules belonging to the home-unit that can
--- be reached by following the given dependencies. Additionally, if both the
--- boot module and the non-boot module can be reached, it only returns the
--- non-boot one.
-hptModulesBelow :: HscEnv -> UnitId -> ModuleNameWithIsBoot -> Set ModNodeKeyWithUid
-hptModulesBelow hsc_env uid mn = filtered_mods $ [ mn |  NodeKey_Module mn <- modules_below]
-  where
-    td_map = mgTransDeps (hsc_mod_graph hsc_env)
-
-    modules_below = maybe [] Set.toList $ Map.lookup (NodeKey_Module (ModNodeKeyWithUid mn uid)) td_map
-
-    filtered_mods = Set.fromDistinctAscList . filter_mods . sort
-
-    -- IsBoot and NotBoot modules are necessarily consecutive in the sorted list
-    -- (cf Ord instance of GenWithIsBoot). Hence we only have to perform a
-    -- linear sweep with a window of size 2 to remove boot modules for which we
-    -- have the corresponding non-boot.
-    filter_mods = \case
-      (r1@(ModNodeKeyWithUid (GWIB m1 b1) uid1) : r2@(ModNodeKeyWithUid (GWIB m2 _) uid2): rs)
-        | m1 == m2  && uid1 == uid2 ->
-                       let !r' = case b1 of
-                                  NotBoot -> r1
-                                  IsBoot  -> r2
-                       in r' : filter_mods rs
-        | otherwise -> r1 : filter_mods (r2:rs)
-      rs -> rs
-
-
-
 -- | Get things from modules "below" this one (in the dependency sense)
 -- C.f Inst.hptInstances
 hptSomeThingsBelowUs :: (HomeModInfo -> [a]) -> Bool -> HscEnv -> UnitId -> ModuleNameWithIsBoot -> [a]
@@ -304,11 +271,12 @@ hptSomeThingsBelowUs extract include_hi_boot hsc_env uid mn
 
   | otherwise
   = let hug = hsc_HUG hsc_env
+        mg  = hsc_mod_graph hsc_env
     in
     [ thing
     |
     -- Find each non-hi-boot module below me
-      (ModNodeKeyWithUid (GWIB { gwib_mod = mod, gwib_isBoot = is_boot }) mod_uid) <- Set.toList (hptModulesBelow hsc_env uid mn)
+      (ModNodeKeyWithUid (GWIB { gwib_mod = mod, gwib_isBoot = is_boot }) mod_uid) <- Set.toList (moduleGraphModulesBelow mg uid mn)
     , include_hi_boot || (is_boot == NotBoot)
 
         -- unsavoury: when compiling the base package with --make, we
@@ -324,7 +292,7 @@ hptSomeThingsBelowUs extract include_hi_boot hsc_env uid mn
                     Nothing -> pprTrace "WARNING in hptSomeThingsBelowUs" msg mempty
           msg = vcat [text "missing module" <+> ppr mod,
                      text "When starting from"  <+> ppr mn,
-                     text "below:" <+> ppr (hptModulesBelow hsc_env uid mn),
+                     text "below:" <+> ppr (moduleGraphModulesBelow mg uid mn),
                       text "Probable cause: out-of-date interface files"]
                         -- This really shouldn't happen, but see #962
     , thing <- things


=====================================
compiler/GHC/Iface/Tidy.hs
=====================================
@@ -1077,7 +1077,8 @@ tidyTopName mod name_cache maybe_ref occ_env id
   -- we have to update the name cache in a nice atomic fashion
 
   | local  && internal = do uniq <- takeUniqFromNameCache name_cache
-                            let new_local_name = mkInternalName uniq occ' loc
+                            -- See #19619
+                            let new_local_name = occ' `seq` mkInternalName uniq occ' loc
                             return (occ_env', new_local_name)
         -- Even local, internal names must get a unique occurrence, because
         -- if we do -split-objs we externalise the name later, in the code generator


=====================================
compiler/GHC/IfaceToCore.hs
=====================================
@@ -12,6 +12,7 @@ Type checking of type signatures in interface files
 
 {-# OPTIONS_GHC -Wno-incomplete-record-updates #-}
 {-# LANGUAGE TupleSections #-}
+{-# LANGUAGE RecordWildCards #-}
 
 module GHC.IfaceToCore (
         tcLookupImported_maybe,
@@ -25,10 +26,15 @@ module GHC.IfaceToCore (
         tcIfaceExpr,    -- Desired by HERMIT (#7683)
         tcIfaceGlobal,
         tcIfaceOneShot, tcTopIfaceBindings,
+        hydrateCgBreakInfo
  ) where
 
 import GHC.Prelude
 
+import GHC.ByteCode.Types
+
+import Data.Word
+
 import GHC.Driver.Env
 import GHC.Driver.Session
 import GHC.Driver.Config.Core.Lint ( initLintConfig )
@@ -2166,3 +2172,12 @@ bindIfaceTyConBinderX :: (IfaceBndr -> (TyCoVar -> IfL a) -> IfL a)
 bindIfaceTyConBinderX bind_tv (Bndr tv vis) thing_inside
   = bind_tv tv $ \tv' ->
     thing_inside (Bndr tv' vis)
+
+-- CgBreakInfo
+
+hydrateCgBreakInfo :: CgBreakInfo -> IfL ([Maybe (Id, Word16)], Type)
+hydrateCgBreakInfo CgBreakInfo{..} = do
+  bindIfaceTyVars cgb_tyvars $ \_ -> do
+    result_ty <- tcIfaceType cgb_resty
+    mbVars <- mapM (traverse (\(if_gbl, offset) -> (,offset) <$> bindIfaceId if_gbl return)) cgb_vars
+    return (mbVars, result_ty)


=====================================
compiler/GHC/Runtime/Eval.hs
=====================================
@@ -136,6 +136,7 @@ import GHC.Tc.Solver (simplifyWantedsTcM)
 import GHC.Tc.Utils.Monad
 import GHC.Core.Class (classTyCon)
 import GHC.Unit.Env
+import GHC.IfaceToCore
 
 -- -----------------------------------------------------------------------------
 -- running a statement interactively
@@ -562,12 +563,19 @@ bindLocalsAtBreakpoint hsc_env apStack_fhv (Just BreakInfo{..}) = do
        breaks    = getModBreaks hmi
        info      = expectJust "bindLocalsAtBreakpoint2" $
                      IntMap.lookup breakInfo_number (modBreaks_breakInfo breaks)
-       mbVars    = cgb_vars info
-       result_ty = cgb_resty info
        occs      = modBreaks_vars breaks ! breakInfo_number
        span      = modBreaks_locs breaks ! breakInfo_number
        decl      = intercalate "." $ modBreaks_decls breaks ! breakInfo_number
 
+  -- Rehydrate to understand the breakpoint info relative to the current environment.
+  -- This design is critical to preventing leaks (#22530)
+   (mbVars, result_ty) <- initIfaceLoad hsc_env
+                            $ initIfaceLcl breakInfo_module (text "debugger") NotBoot
+                            $ hydrateCgBreakInfo info
+
+
+   let
+
            -- Filter out any unboxed ids by changing them to Nothings;
            -- we can't bind these at the prompt
        mbPointers = nullUnboxed <$> mbVars


=====================================
compiler/GHC/StgToByteCode.hs
=====================================
@@ -89,6 +89,7 @@ import Data.Either ( partitionEithers )
 
 import GHC.Stg.Syntax
 import qualified Data.IntSet as IntSet
+import GHC.CoreToIface
 
 -- -----------------------------------------------------------------------------
 -- Generating byte code for a complete module
@@ -370,10 +371,8 @@ schemeER_wrk d p (StgTick (Breakpoint tick_ty tick_no fvs) rhs)
         this_mod <- moduleName <$> getCurrentModule
         platform <- profilePlatform <$> getProfile
         let idOffSets = getVarOffSets platform d p fvs
-        let breakInfo = CgBreakInfo
-                        { cgb_vars = idOffSets
-                        , cgb_resty = tick_ty
-                        }
+            ty_vars   = tyCoVarsOfTypesWellScoped (tick_ty:map idType fvs)
+        let breakInfo = dehydrateCgBreakInfo ty_vars idOffSets tick_ty
         newBreakInfo tick_no breakInfo
         hsc_env <- getHscEnv
         let cc | Just interp <- hsc_interp hsc_env


=====================================
compiler/GHC/Tc/Instance/Family.hs
=====================================
@@ -519,16 +519,16 @@ tcLookupDataFamInst_maybe fam_inst_envs tc tc_args
   = Nothing
 
 -- | 'tcTopNormaliseNewTypeTF_maybe' gets rid of top-level newtypes,
--- potentially looking through newtype /instances/.
+-- potentially looking through newtype /instances/ and type synonyms.
 --
 -- It is only used by the type inference engine (specifically, when
 -- solving representational equality), and hence it is careful to unwrap
 -- only if the relevant data constructor is in scope.  That's why
 -- it gets a GlobalRdrEnv argument.
 --
--- It is careful not to unwrap data/newtype instances if it can't
--- continue unwrapping.  Such care is necessary for proper error
--- messages.
+-- It is careful not to unwrap data/newtype instances nor synonyms
+-- if it can't continue unwrapping.  Such care is necessary for proper
+-- error messages.
 --
 -- It does not look through type families.
 -- It does not normalise arguments to a tycon.


=====================================
compiler/GHC/Tc/Solver/Canonical.hs
=====================================
@@ -61,7 +61,6 @@ import GHC.Types.Basic
 
 import qualified Data.Semigroup as S
 import Data.Bifunctor ( bimap )
-import Data.Foldable ( traverse_ )
 
 {-
 ************************************************************************
@@ -1085,7 +1084,7 @@ can_eq_nc' _rewritten _rdr_env _envs ev eq_rel
 
 -- Decompose type constructor applications
 -- NB: we have expanded type synonyms already
-can_eq_nc' _rewritten _rdr_env _envs ev eq_rel ty1 _ ty2 _
+can_eq_nc' rewritten _rdr_env _envs ev eq_rel ty1 _ ty2 _
   | Just (tc1, tys1) <- tcSplitTyConApp_maybe ty1
   , Just (tc2, tys2) <- tcSplitTyConApp_maybe ty2
    -- we want to catch e.g. Maybe Int ~ (Int -> Int) here for better
@@ -1093,7 +1092,7 @@ can_eq_nc' _rewritten _rdr_env _envs ev eq_rel ty1 _ ty2 _
    -- hence no direct match on TyConApp
   , not (isTypeFamilyTyCon tc1)
   , not (isTypeFamilyTyCon tc2)
-  = canTyConApp ev eq_rel tc1 tys1 tc2 tys2
+  = canTyConApp rewritten ev eq_rel tc1 tys1 tc2 tys2
 
 can_eq_nc' _rewritten _rdr_env _envs ev eq_rel
            s1@(ForAllTy (Bndr _ vis1) _) _
@@ -1115,11 +1114,8 @@ can_eq_nc' True _rdr_env _envs ev NomEq ty1 _ ty2 _
 -------------------
 
 -- No similarity in type structure detected. Rewrite and try again.
-can_eq_nc' False rdr_env envs ev eq_rel _ ps_ty1 _ ps_ty2
-  = do { (redn1@(Reduction _ xi1), rewriters1) <- rewrite ev ps_ty1
-       ; (redn2@(Reduction _ xi2), rewriters2) <- rewrite ev ps_ty2
-       ; new_ev <- rewriteEqEvidence (rewriters1 S.<> rewriters2) ev NotSwapped redn1 redn2
-       ; can_eq_nc' True rdr_env envs new_ev eq_rel xi1 xi1 xi2 xi2 }
+can_eq_nc' False _rdr_env _envs ev eq_rel _ ps_ty1 _ ps_ty2
+  = rewrite_and_try_again ev eq_rel ps_ty1 ps_ty2
 
 ----------------------------
 -- Look for a canonical LHS. See Note [Canonical LHS].
@@ -1157,6 +1153,16 @@ can_eq_nc' True _rdr_env _envs ev eq_rel _ ps_ty1 _ ps_ty2
           -- No need to call canEqFailure/canEqHardFailure because they
           -- rewrite, and the types involved here are already rewritten
 
+-- Rewrite the two types and try again
+rewrite_and_try_again :: CtEvidence -> EqRel -> TcType -> TcType -> TcS (StopOrContinue Ct)
+rewrite_and_try_again ev eq_rel ty1 ty2
+  = do { (redn1@(Reduction _ xi1), rewriters1) <- rewrite ev ty1
+       ; (redn2@(Reduction _ xi2), rewriters2) <- rewrite ev ty2
+       ; new_ev <- rewriteEqEvidence (rewriters1 S.<> rewriters2) ev NotSwapped redn1 redn2
+       ; rdr_env <- getGlobalRdrEnvTcS
+       ; envs <- getFamInstEnvs
+       ; can_eq_nc' True rdr_env envs new_ev eq_rel xi1 xi1 xi2 xi2 }
+
 {- Note [Unsolved equalities]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 If we have an unsolved equality like
@@ -1379,45 +1385,84 @@ zonk_eq_types = go
     combine_rev f (Right tys) (Left elt) = Left (f <$> elt     <*> pure tys)
     combine_rev f (Right tys) (Right ty) = Right (f ty tys)
 
-{- See Note [Unwrap newtypes first]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+{- Note [Unwrap newtypes first]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 See also Note [Decomposing newtype equalities]
 
 Consider
   newtype N m a = MkN (m a)
-Then N will get a conservative, Nominal role for its second parameter 'a',
+N will get a conservative, Nominal role for its second parameter 'a',
 because it appears as an argument to the unknown 'm'. Now consider
   [W] N Maybe a  ~R#  N Maybe b
 
-If we decompose, we'll get
+If we /decompose/, we'll get
   [W] a ~N# b
 
-But if instead we unwrap we'll get
+But if instead we /unwrap/ we'll get
   [W] Maybe a ~R# Maybe b
 which in turn gives us
   [W] a ~R# b
 which is easier to satisfy.
 
-Bottom line: unwrap newtypes before decomposing them!
-c.f. #9123 comment:52,53 for a compelling example.
+Conclusion: we must unwrap newtypes before decomposing them. This happens
+in `can_eq_newtype_nc`
 
-Note [Newtypes can blow the stack]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Suppose we have
+But even this is challenging. Here are two cases to consider:
 
-  newtype X = MkX (Int -> X)
-  newtype Y = MkY (Int -> Y)
+Case 1:
+
+  newtype Age = MkAge Int
+  [G] c
+  [W] w1 :: IO Age ~R# IO Int
 
-and now wish to prove
+Case 2:
 
-  [W] X ~R Y
+  newtype A = MkA [A]
+  [W] A ~R# [A]
 
-This Wanted will loop, expanding out the newtypes ever deeper looking
-for a solid match or a solid discrepancy. Indeed, there is something
-appropriate to this looping, because X and Y *do* have the same representation,
-in the limit -- they're both (Fix ((->) Int)). However, no finitely-sized
-coercion will ever witness it. This loop won't actually cause GHC to hang,
-though, because we check our depth when unwrapping newtypes.
+For Case 1, recall that IO is an abstract newtype. Then read Note
+[Decomposing newtype equalities]. According to that Note, we should not
+decompose w1, because we have an Irred Given. Yet we still want to solve
+the wanted!  We can do so by unwrapping the (non-abstract) Age newtype
+underneath the IO, giving
+   [W] w2 :: IO Int ~R# IO Int
+   w1 = (IO unwrap-Age ; w2)
+where unwrap-Age :: Age ~R# Int. Now we can solve w2 by reflexivity;
+see Note [Eager reflexivity check].
+
+Conclusion: unwrap newtypes (deeply, inside types) in the rewriter:
+specifically in GHC.Tc.Solver.Rewrite.rewrite_newtype_app.
+
+Yet for Case 2, deep rewriting would be a disaster: we would loop.
+  [W] A ~R# [A] ---> {unwrap}
+                     [W] [A] ~R# [[A]]
+                ---> {decompose}
+                     [W] A ~R# [A]
+
+In this case, we just want to unwrap newtypes /at the top level/, allowing us
+to succeed via Note [Eager reflexivity check]:
+  [W] A ~R# [A] ---> {unwrap at top level only}
+                     [W] [A] ~R# [A]
+                ---> {reflexivity} success
+
+Conclusion: to satisfy Case 1 and Case 2, we unwrap
+* /both/ at top level, in can_eq_nc'
+* /and/ deeply, in the rewriter, rewrite_newtype_app
+
+The former unwraps outer newtypes (when the data constructor is in scope).
+The latter unwraps deeply -- but it won't be invoked in Case 2, when we can
+recognize an equality between the types [A] and [A] before rewriting
+deeply.
+
+This "before" business is delicate -- there is still a real risk of a loop
+in the type checker with recursive newtypes -- but I think we're doomed to do
+*something* delicate, as we're really trying to solve for equirecursive
+type equality. Bottom line for users: recursive newtypes are dangerous.
+See also Section 5.3.1 and 5.3.4 of
+"Safe Zero-cost Coercions for Haskell" (JFP 2016).
+
+Another approach -- which we ultimately decided against -- is described in
+Note [Decomposing newtypes a bit more aggressively].
 
 Note [Eager reflexivity check]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -1462,28 +1507,22 @@ can_eq_newtype_nc ev swapped ty1 ((gres, co1), ty1') ty2 ps_ty2
   = do { traceTcS "can_eq_newtype_nc" $
          vcat [ ppr ev, ppr swapped, ppr co1, ppr gres, ppr ty1', ppr ty2 ]
 
-         -- check for blowing our stack:
+         -- Check for blowing our stack, and increase the depth
          -- See Note [Newtypes can blow the stack]
-       ; checkReductionDepth (ctEvLoc ev) ty1
+       ; let loc = ctEvLoc ev
+             ev' = ev `setCtEvLoc` bumpCtLocDepth loc
+       ; checkReductionDepth loc ty1
 
          -- Next, we record uses of newtype constructors, since coercing
          -- through newtypes is tantamount to using their constructors.
-       ; addUsedGREs gre_list
-         -- If a newtype constructor was imported, don't warn about not
-         -- importing it...
-       ; traverse_ keepAlive $ map greMangledName gre_list
-         -- ...and similarly, if a newtype constructor was defined in the same
-         -- module, don't warn about it being unused.
-         -- See Note [Tracking unused binding and imports] in GHC.Tc.Utils.
+       ; recordUsedGREs gres
 
        ; let redn1 = mkReduction co1 ty1'
 
-       ; new_ev <- rewriteEqEvidence emptyRewriterSet ev swapped
+       ; new_ev <- rewriteEqEvidence emptyRewriterSet ev' swapped
                      redn1
                      (mkReflRedn Representational ps_ty2)
        ; can_eq_nc False new_ev ReprEq ty1' ty1' ty2 ps_ty2 }
-  where
-    gre_list = bagToList gres
 
 ---------
 -- ^ Decompose a type application.
@@ -1559,7 +1598,8 @@ canEqCast rewritten ev eq_rel swapped ty1 co1 ty2 ps_ty2
     role = eqRelRole eq_rel
 
 ------------------------
-canTyConApp :: CtEvidence -> EqRel
+canTyConApp :: Bool   -- True <=> the types have been rewritten
+            -> CtEvidence -> EqRel
             -> TyCon -> [TcType]
             -> TyCon -> [TcType]
             -> TcS (StopOrContinue Ct)
@@ -1567,13 +1607,17 @@ canTyConApp :: CtEvidence -> EqRel
 -- See Note [Decomposing Dependent TyCons and Processing Wanted Equalities]
 -- Neither tc1 nor tc2 is a saturated funTyCon, nor a type family
 -- But they can be data families.
-canTyConApp ev eq_rel tc1 tys1 tc2 tys2
+canTyConApp rewritten ev eq_rel tc1 tys1 tc2 tys2
   | tc1 == tc2
   , tys1 `equalLength` tys2
   = do { inerts <- getTcSInerts
        ; if can_decompose inerts
          then canDecomposableTyConAppOK ev eq_rel tc1 tys1 tys2
-         else canEqFailure ev eq_rel ty1 ty2 }
+         else if rewritten
+              then canEqFailure ev eq_rel ty1 ty2
+              else rewrite_and_try_again ev eq_rel ty1 ty2 }
+              -- Why rewrite and try again?  See Case 1
+              -- of Note [Unwrap newtypes first]
 
   -- See Note [Skolem abstract data] in GHC.Core.Tycon
   | tyConSkolem tc1 || tyConSkolem tc2
@@ -1851,8 +1895,12 @@ only about /completeness/.
 
 Note [Decomposing newtypes a bit more aggressively]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-c.f. https://github.com/ghc-proposals/ghc-proposals/pull/549,
-issue #22441, and discussion on !9282.
+IMPORTANT: the ideas in this Note are *not* implemented. Instead, the
+current approach is detailed in Note [Unwrap newtypes first].
+For more details about the ideas in this Note see
+  * GHC proposal: https://github.com/ghc-proposals/ghc-proposals/pull/549
+  * issue #22441
+  * discussion on !9282.
 
 Consider [G] c, [W] (IO Int) ~R (IO Age)
 where IO is abstract, and
@@ -2091,8 +2139,8 @@ canEqHardFailure :: CtEvidence
 -- See Note [Make sure that insolubles are fully rewritten]
 canEqHardFailure ev ty1 ty2
   = do { traceTcS "canEqHardFailure" (ppr ty1 $$ ppr ty2)
-       ; (redn1, rewriters1) <- rewrite ev ty1
-       ; (redn2, rewriters2) <- rewrite ev ty2
+       ; (redn1, rewriters1) <- rewriteForErrors ev ty1
+       ; (redn2, rewriters2) <- rewriteForErrors ev ty2
        ; new_ev <- rewriteEqEvidence (rewriters1 S.<> rewriters2) ev NotSwapped redn1 redn2
        ; continueWith (mkIrredCt ShapeMismatchReason new_ev) }
 
@@ -3203,12 +3251,12 @@ rewriteEqEvidence new_rewriters old_ev swapped (Reduction lhs_co nlhs) (Reductio
   = do { let new_tm = evCoercion ( mkSymCo lhs_co
                                   `mkTransCo` maybeSymCo swapped (mkCoVarCo old_evar)
                                   `mkTransCo` rhs_co)
-       ; newGivenEvVar loc' (new_pred, new_tm) }
+       ; newGivenEvVar loc (new_pred, new_tm) }
 
   | CtWanted { ctev_dest = dest
              , ctev_rewriters = rewriters } <- old_ev
   , let rewriters' = rewriters S.<> new_rewriters
-  = do { (new_ev, hole_co) <- newWantedEq loc' rewriters'
+  = do { (new_ev, hole_co) <- newWantedEq loc rewriters'
                                           (ctEvRole old_ev) nlhs nrhs
        ; let co = maybeSymCo swapped $
                   lhs_co
@@ -3228,12 +3276,7 @@ rewriteEqEvidence new_rewriters old_ev swapped (Reduction lhs_co nlhs) (Reductio
 #endif
   where
     new_pred = mkTcEqPredLikeEv old_ev nlhs nrhs
-
-      -- equality is like a type class. Bumping the depth is necessary because
-      -- of recursive newtypes, where "reducing" a newtype can actually make
-      -- it bigger. See Note [Newtypes can blow the stack].
     loc      = ctEvLoc old_ev
-    loc'     = bumpCtLocDepth loc
 
 {-
 ************************************************************************


=====================================
compiler/GHC/Tc/Solver/Monad.hs
=====================================
@@ -25,7 +25,7 @@ module GHC.Tc.Solver.Monad (
     updWorkListTcS,
     pushLevelNoWorkList,
 
-    runTcPluginTcS, addUsedGRE, addUsedGREs, keepAlive,
+    runTcPluginTcS, recordUsedGREs,
     matchGlobalInst, TcM.ClsInstResult(..),
 
     QCInst(..),
@@ -151,7 +151,7 @@ import GHC.Core.TyCon
 import GHC.Types.Error ( mkPlainError, noHints )
 import GHC.Types.Name
 import GHC.Types.TyThing
-import GHC.Types.Name.Reader ( GlobalRdrEnv, GlobalRdrElt )
+import GHC.Types.Name.Reader
 
 import GHC.Unit.Module ( HasModule, getModule, extractModule )
 import qualified GHC.Rename.Env as TcM
@@ -175,7 +175,8 @@ import Control.Monad
 import GHC.Utils.Monad
 import Data.IORef
 import GHC.Exts (oneShot)
-import Data.List ( mapAccumL, partition, find )
+import Data.List ( mapAccumL, partition )
+import Data.Foldable
 
 #if defined(DEBUG)
 import GHC.Data.Graph.Directed
@@ -1373,17 +1374,22 @@ tcLookupClass c = wrapTcS $ TcM.tcLookupClass c
 tcLookupId :: Name -> TcS Id
 tcLookupId n = wrapTcS $ TcM.tcLookupId n
 
--- Setting names as used (used in the deriving of Coercible evidence)
--- Too hackish to expose it to TcS? In that case somehow extract the used
--- constructors from the result of solveInteract
-addUsedGREs :: [GlobalRdrElt] -> TcS ()
-addUsedGREs gres = wrapTcS  $ TcM.addUsedGREs gres
+-- Any use of this function is a bit suspect, because it violates the
+-- pure veneer of TcS. But it's just about warnings around unused imports
+-- and local constructors (GHC will issue fewer warnings than it otherwise
+-- might), so it's not worth losing sleep over.
+recordUsedGREs :: Bag GlobalRdrElt -> TcS ()
+recordUsedGREs gres
+  = do { wrapTcS $ TcM.addUsedGREs gre_list
+         -- If a newtype constructor was imported, don't warn about not
+         -- importing it...
+       ; wrapTcS $ traverse_ (TcM.keepAlive . greMangledName) gre_list }
+         -- ...and similarly, if a newtype constructor was defined in the same
+         -- module, don't warn about it being unused.
+         -- See Note [Tracking unused binding and imports] in GHC.Tc.Utils.
 
-addUsedGRE :: Bool -> GlobalRdrElt -> TcS ()
-addUsedGRE warn_if_deprec gre = wrapTcS $ TcM.addUsedGRE warn_if_deprec gre
-
-keepAlive :: Name -> TcS ()
-keepAlive = wrapTcS . TcM.keepAlive
+  where
+    gre_list = bagToList gres
 
 -- Various smaller utilities [TODO, maybe will be absorbed in the instance matcher]
 -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


=====================================
compiler/GHC/Tc/Solver/Rewrite.hs
=====================================
@@ -3,7 +3,7 @@
 {-# LANGUAGE DeriveFunctor #-}
 
 module GHC.Tc.Solver.Rewrite(
-   rewrite, rewriteArgsNom,
+   rewrite, rewriteForErrors, rewriteArgsNom,
    rewriteType
  ) where
 
@@ -42,6 +42,7 @@ import GHC.Builtin.Types (tYPETyCon)
 import Data.List ( find )
 import GHC.Data.List.Infinite (Infinite)
 import qualified GHC.Data.List.Infinite as Inf
+import GHC.Tc.Instance.Family (tcTopNormaliseNewTypeTF_maybe)
 
 {-
 ************************************************************************
@@ -223,6 +224,22 @@ rewrite ev ty
        ; traceTcS "rewrite }" (ppr $ reductionReducedType redn)
        ; return result }
 
+-- | See Note [Rewriting]
+-- This variant of 'rewrite' rewrites w.r.t. nominal equality only,
+-- as this is better than full rewriting for error messages. Specifically,
+-- we want to avoid unwrapping newtypes, as doing so can end up causing
+-- an otherwise-unnecessary stack overflow.
+rewriteForErrors :: CtEvidence -> TcType
+                 -> TcS (Reduction, RewriterSet)
+rewriteForErrors ev ty
+  = do { traceTcS "rewriteForErrors {" (ppr ty)
+       ; result@(redn, rewriters) <-
+           runRewrite (ctEvLoc ev) (ctEvFlavour ev) NomEq (rewrite_one ty)
+       ; traceTcS "rewriteForErrors }" (ppr $ reductionReducedType redn)
+       ; return $ case ctEvEqRel ev of
+           NomEq -> result
+           ReprEq -> (mkSubRedn redn, rewriters) }
+
 -- See Note [Rewriting]
 rewriteArgsNom :: CtEvidence -> TyCon -> [TcType]
                -> TcS (Reductions, RewriterSet)
@@ -482,16 +499,27 @@ rewrite_one (TyVarTy tv)
 rewrite_one (AppTy ty1 ty2)
   = rewrite_app_tys ty1 [ty2]
 
-rewrite_one (TyConApp tc tys)
+rewrite_one ty@(TyConApp tc tys)
   -- If it's a type family application, try to reduce it
   | isTypeFamilyTyCon tc
   = rewrite_fam_app tc tys
 
-  -- For * a normal data type application
-  --     * data family application
-  -- we just recursively rewrite the arguments.
   | otherwise
-  = rewrite_ty_con_app tc tys
+  = do { eq_rel <- getEqRel
+       ; if eq_rel == ReprEq
+
+         then -- Rewriting w.r.t. representational equality requires
+              --   unwrapping newtypes; see GHC.Tc.Solver.Canonical.
+              --   Note [Unwrap newtypes first]
+              -- NB: try rewrite_newtype_app even when tc isn't a newtype;
+              -- this allows the possibility of having a newtype buried under
+              -- a synonym. Needed for e.g. T12067.
+              rewrite_newtype_app ty
+
+         else -- For * a normal data type application
+              --     * data family application
+              -- we just recursively rewrite the arguments.
+              rewrite_ty_con_app tc tys }
 
 rewrite_one (FunTy { ft_af = vis, ft_mult = mult, ft_arg = ty1, ft_res = ty2 })
   = do { arg_redn <- rewrite_one ty1
@@ -650,7 +678,43 @@ rewrite_vector ki roles tys
     fvs                                = tyCoVarsOfType ki
 {-# INLINE rewrite_vector #-}
 
-{-
+-- Rewrite a (potential) newtype application
+-- Precondition: the ambient EqRel is ReprEq
+-- Precondition: the type is a TyConApp
+-- See Note [Newtypes can blow the stack]
+rewrite_newtype_app :: TcType -> RewriteM Reduction
+rewrite_newtype_app ty@(TyConApp tc tys)
+  = do { rdr_env <- liftTcS getGlobalRdrEnvTcS
+       ; tf_envs <- liftTcS getFamInstEnvs
+       ; case (tcTopNormaliseNewTypeTF_maybe tf_envs rdr_env ty) of
+           Nothing -> -- Non-newtype or abstract newtype
+                      rewrite_ty_con_app tc tys
+
+           Just ((used_ctors, co), ty')   -- co :: ty ~ ty'
+             -> do { liftTcS $ recordUsedGREs used_ctors
+                   ; checkStackDepth ty
+                   ; rewrite_reduction (Reduction co ty') } }
+
+rewrite_newtype_app other_ty = pprPanic "rewrite_newtype_app" (ppr other_ty)
+
+{- Note [Newtypes can blow the stack]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Suppose we have
+
+  newtype X = MkX (Int -> X)
+  newtype Y = MkY (Int -> Y)
+
+and now wish to prove
+
+  [W] X ~R Y
+
+This Wanted will loop, expanding out the newtypes ever deeper looking
+for a solid match or a solid discrepancy. Indeed, there is something
+appropriate to this looping, because X and Y *do* have the same representation,
+in the limit -- they're both (Fix ((->) Int)). However, no finitely-sized
+coercion will ever witness it. This loop won't actually cause GHC to hang,
+though, because we check our depth when unwrapping newtypes.
+
 Note [Rewriting synonyms]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~
 Not expanding synonyms aggressively improves error messages, and


=====================================
compiler/GHC/Tc/Types.hs
=====================================
@@ -294,6 +294,7 @@ data RewriteEnv
        -- ^ At what role are we rewriting?
        --
        -- See Note [Rewriter EqRels] in GHC.Tc.Solver.Rewrite
+
        , re_rewriters :: !(TcRef RewriterSet)  -- ^ See Note [Wanteds rewrite Wanteds]
        }
 -- RewriteEnv is mostly used in @GHC.Tc.Solver.Rewrite@, but it is defined


=====================================
compiler/GHC/Types/Name.hs
=====================================
@@ -155,7 +155,7 @@ instance Outputable NameSort where
   ppr  System         = text "system"
 
 instance NFData Name where
-  rnf Name{..} = rnf n_sort
+  rnf Name{..} = rnf n_sort `seq` rnf n_occ `seq` n_uniq `seq` rnf n_loc
 
 instance NFData NameSort where
   rnf (External m) = rnf m


=====================================
compiler/GHC/Unit/Module/Graph.hs
=====================================
@@ -22,6 +22,7 @@ module GHC.Unit.Module.Graph
    , showModMsg
    , moduleGraphNodeModule
    , moduleGraphNodeModSum
+   , moduleGraphModulesBelow
 
    , moduleGraphNodes
    , SummaryNode
@@ -62,12 +63,14 @@ import System.FilePath
 import qualified Data.Map as Map
 import GHC.Types.Unique.DSet
 import qualified Data.Set as Set
+import Data.Set (Set)
 import GHC.Unit.Module
 import GHC.Linker.Static.Utils
 
 import Data.Bifunctor
 import Data.Either
 import Data.Function
+import Data.List (sort)
 import GHC.Data.List.SetOps
 
 -- | A '@ModuleGraphNode@' is a node in the '@ModuleGraph@'.
@@ -131,8 +134,8 @@ nodeKeyModName :: NodeKey -> Maybe ModuleName
 nodeKeyModName (NodeKey_Module mk) = Just (gwib_mod $ mnkModuleName mk)
 nodeKeyModName _ = Nothing
 
-data ModNodeKeyWithUid = ModNodeKeyWithUid { mnkModuleName :: ModuleNameWithIsBoot
-                                           , mnkUnitId     :: UnitId } deriving (Eq, Ord)
+data ModNodeKeyWithUid = ModNodeKeyWithUid { mnkModuleName :: !ModuleNameWithIsBoot
+                                           , mnkUnitId     :: !UnitId } deriving (Eq, Ord)
 
 instance Outputable ModNodeKeyWithUid where
   ppr (ModNodeKeyWithUid mnwib uid) = ppr uid <> colon <> ppr mnwib
@@ -385,3 +388,30 @@ msKey ms = ModNodeKeyWithUid (ms_mnwib ms) (ms_unitid ms)
 
 type ModNodeKey = ModuleNameWithIsBoot
 
+
+-- | This function returns all the modules belonging to the home-unit that can
+-- be reached by following the given dependencies. Additionally, if both the
+-- boot module and the non-boot module can be reached, it only returns the
+-- non-boot one.
+moduleGraphModulesBelow :: ModuleGraph -> UnitId -> ModuleNameWithIsBoot -> Set ModNodeKeyWithUid
+moduleGraphModulesBelow mg uid mn = filtered_mods $ [ mn |  NodeKey_Module mn <- modules_below]
+  where
+    td_map = mgTransDeps mg
+
+    modules_below = maybe [] Set.toList $ Map.lookup (NodeKey_Module (ModNodeKeyWithUid mn uid)) td_map
+
+    filtered_mods = Set.fromDistinctAscList . filter_mods . sort
+
+    -- IsBoot and NotBoot modules are necessarily consecutive in the sorted list
+    -- (cf Ord instance of GenWithIsBoot). Hence we only have to perform a
+    -- linear sweep with a window of size 2 to remove boot modules for which we
+    -- have the corresponding non-boot.
+    filter_mods = \case
+      (r1@(ModNodeKeyWithUid (GWIB m1 b1) uid1) : r2@(ModNodeKeyWithUid (GWIB m2 _) uid2): rs)
+        | m1 == m2  && uid1 == uid2 ->
+                       let !r' = case b1 of
+                                  NotBoot -> r1
+                                  IsBoot  -> r2
+                       in r' : filter_mods rs
+        | otherwise -> r1 : filter_mods (r2:rs)
+      rs -> rs


=====================================
hadrian/src/Hadrian/Haskell/Cabal/Parse.hs
=====================================
@@ -319,8 +319,11 @@ write_inplace_conf pkg_path res_path pd lbi = do
                   pkg_name = C.display (C.pkgName (CP.sourcePackageId installedPkgInfo))
                   final_ipi = installedPkgInfo {
                                  Installed.includeDirs = concatMap fixupIncludeDir (Installed.includeDirs installedPkgInfo),
-                                 Installed.libraryDirs = [ build_dir ],
-                                 Installed.libraryDynDirs = [ build_dir ],
+                                 Installed.libraryDirs = build_dir: (concatMap fixupIncludeDir (Installed.libraryDirs installedPkgInfo)) ,
+#if MIN_VERSION_Cabal(3,8,0)
+                                 Installed.libraryDirsStatic = build_dir: (concatMap fixupIncludeDir (Installed.libraryDirsStatic installedPkgInfo)) ,
+#endif
+                                 Installed.libraryDynDirs = build_dir : (concatMap fixupIncludeDir (Installed.libraryDynDirs installedPkgInfo)) ,
                                  Installed.dataDir = "${pkgroot}/../../../../" ++ pkg_path,
                                  Installed.haddockHTMLs = [build_dir ++ "/doc/html/" ++ C.display (CP.sourcePackageId installedPkgInfo)],
                                  Installed.haddockInterfaces = [build_dir ++ "/doc/html/" ++  pkg_name ++ "/" ++ pkg_name ++ ".haddock"],


=====================================
hadrian/src/Rules/Documentation.hs
=====================================
@@ -12,7 +12,7 @@ import Hadrian.BuildPath
 import Hadrian.Haskell.Cabal
 import Hadrian.Haskell.Cabal.Type
 
-import Rules.Generate (ghcPrimDependencies)
+import Rules.Generate (ghcPrimDependencies, generateTemplateResults)
 import Base
 import Context
 import Expression (getContextData, interpretInContext, (?), package)
@@ -68,6 +68,12 @@ pathPath "users_guide" = "docs/users_guide"
 pathPath "Haddock" = "utils/haddock/doc"
 pathPath _ = ""
 
+-- Generate files required to build the docs (e.g. ghc.cabal)
+needDocDeps :: Action ()
+needDocDeps = do
+  -- build .cabal files used by the doc engine to list package versions
+  generateTemplateResults
+
 -- | Build all documentation
 documentationRules :: Rules ()
 documentationRules = do
@@ -188,6 +194,9 @@ buildSphinxHtml :: FilePath -> Rules ()
 buildSphinxHtml path = do
     root <- buildRootRules
     root -/- htmlRoot -/- path -/- "index.html" %> \file -> do
+
+        needDocDeps
+
         let dest = takeDirectory file
             rstFilesDir = pathPath path
         rstFiles <- getDirectoryFiles rstFilesDir ["**/*.rst"]
@@ -301,6 +310,9 @@ buildSphinxPdf :: FilePath -> Rules ()
 buildSphinxPdf path = do
     root <- buildRootRules
     root -/- pdfRoot -/- path <.> "pdf" %> \file -> do
+
+        needDocDeps
+
         withTempDir $ \dir -> do
             let rstFilesDir = pathPath path
             rstFiles <- getDirectoryFiles rstFilesDir ["**/*.rst"]


=====================================
hadrian/src/Rules/Generate.hs
=====================================
@@ -2,7 +2,7 @@ module Rules.Generate (
     isGeneratedCmmFile, compilerDependencies, generatePackageCode,
     generateRules, copyRules, generatedDependencies,
     ghcPrimDependencies,
-    templateRules
+    templateRules, generateTemplateResults
     ) where
 
 import qualified Data.Set as Set
@@ -243,7 +243,6 @@ templateResults =
     , "driver/ghci/ghci-wrapper.cabal"
     , "ghc/ghc-bin.cabal"
     , "utils/iserv/iserv.cabal"
-    , "utils/iserv-proxy/iserv-proxy.cabal"
     , "utils/remote-iserv/remote-iserv.cabal"
     , "utils/runghc/runghc.cabal"
     , "libraries/ghc-boot/ghc-boot.cabal"
@@ -256,6 +255,10 @@ templateResults =
     , "libraries/prologue.txt"
     ]
 
+-- | Generate all the files we know we have a template for
+generateTemplateResults :: Action ()
+generateTemplateResults = need templateResults
+
 templateRules :: Rules ()
 templateRules = do
   templateResults |%> \out -> do


=====================================
hadrian/src/Settings/Builders/RunTest.hs
=====================================
@@ -145,7 +145,7 @@ outOfTreeCompilerArgs = do
     withNativeCodeGen   <- getBooleanSetting TestGhcWithNativeCodeGen
     withInterpreter     <- getBooleanSetting TestGhcWithInterpreter
     unregisterised      <- getBooleanSetting TestGhcUnregisterised
-    tables_next_to_code <- getBooleanSetting TestGhcUnregisterised
+    tables_next_to_code <- getBooleanSetting TestGhcTablesNextToCode
     targetWithSMP       <- targetSupportsSMP
     debugAssertions     <- getBooleanSetting TestGhcDebugged
 


=====================================
libraries/base/Data/Kind.hs
=====================================
@@ -1,4 +1,5 @@
-{-# LANGUAGE Trustworthy, ExplicitNamespaces #-}
+{-# LANGUAGE Trustworthy #-}
+{-# LANGUAGE NoImplicitPrelude #-}
 
 -----------------------------------------------------------------------------
 -- |


=====================================
libraries/base/Data/Void.hs
=====================================
@@ -1,4 +1,5 @@
 {-# LANGUAGE Trustworthy #-}
+{-# LANGUAGE NoImplicitPrelude #-}
 
 -----------------------------------------------------------------------------
 -- |


=====================================
libraries/base/GHC/Foreign.hs
=====================================
@@ -202,11 +202,11 @@ peekEncodedCString :: TextEncoding -- ^ Encoding of CString
 peekEncodedCString (TextEncoding { mkTextDecoder = mk_decoder }) (p, sz_bytes)
   = bracket mk_decoder close $ \decoder -> do
       let chunk_size = sz_bytes `max` 1 -- Decode buffer chunk size in characters: one iteration only for ASCII
-      from0 <- fmap (\fp -> bufferAdd sz_bytes (emptyBuffer fp sz_bytes ReadBuffer)) $ newForeignPtr_ (castPtr p)
-      to <- newCharBuffer chunk_size WriteBuffer
+      !from0 <- fmap (\fp -> bufferAdd sz_bytes (emptyBuffer fp sz_bytes ReadBuffer)) $ newForeignPtr_ (castPtr p)
+      !to    <- newCharBuffer chunk_size WriteBuffer
 
-      let go !iteration from = do
-            (why, from', to') <- encode decoder from to
+      let go !iteration !from = do
+            (why, from', !to') <- encode decoder from to
             if isEmptyBuffer from'
              then
               -- No input remaining: @why@ will be InputUnderflow, but we don't care
@@ -281,11 +281,11 @@ newEncodedCString (TextEncoding { mkTextEncoder = mk_encoder }) null_terminate s
 
 tryFillBuffer :: TextEncoder dstate -> Bool -> Buffer Char -> Ptr Word8 -> Int
                     ->  IO (Maybe (Buffer Word8))
-tryFillBuffer encoder null_terminate from0 to_p to_sz_bytes = do
-    to_fp <- newForeignPtr_ to_p
-    go (0 :: Int) (from0, emptyBuffer to_fp to_sz_bytes WriteBuffer)
+tryFillBuffer encoder null_terminate from0 to_p !to_sz_bytes = do
+    !to_fp <- newForeignPtr_ to_p
+    go (0 :: Int) from0 (emptyBuffer to_fp to_sz_bytes WriteBuffer)
   where
-    go !iteration (from, to) = do
+    go !iteration !from !to = do
       (why, from', to') <- encode encoder from to
       putDebugMsg ("tryFillBufferAndCall: " ++ show iteration ++ " " ++ show why ++ " " ++ summaryBuffer from ++ " " ++ summaryBuffer from')
       if isEmptyBuffer from'
@@ -293,8 +293,8 @@ tryFillBuffer encoder null_terminate from0 to_p to_sz_bytes = do
              then return Nothing -- We had enough for the string but not the terminator: ask the caller for more buffer
              else return (Just to')
        else case why of -- We didn't consume all of the input
-              InputUnderflow  -> recover encoder from' to' >>= go (iteration + 1) -- These conditions are equally bad
-              InvalidSequence -> recover encoder from' to' >>= go (iteration + 1) -- since the input was truncated/invalid
+              InputUnderflow  -> recover encoder from' to' >>= \(a,b) -> go (iteration + 1) a b -- These conditions are equally bad
+              InvalidSequence -> recover encoder from' to' >>= \(a,b) -> go (iteration + 1) a b -- since the input was truncated/invalid
               OutputUnderflow -> return Nothing -- Oops, out of buffer during decoding: ask the caller for more
 {-
 Note [Check *before* fill in withEncodedCString]


=====================================
libraries/base/changelog.md
=====================================
@@ -1,20 +1,23 @@
 # Changelog for [`base` package](http://hackage.haskell.org/package/base)
 
 ## 4.18.0.0 *TBA*
-  * Add `forall a. Functor (p a)` superclass for `Bifunctor p`.
-  * Add Functor instances for `(,,,,) a b c d`, `(,,,,,) a b c d e` and
-    `(,,,,,) a b c d e f`.
+  * Add `forall a. Functor (p a)` superclass for `Bifunctor p` ([CLC proposal #91](https://github.com/haskell/core-libraries-committee/issues/91))
+  * Add `Functor` instances for `(,,,,) a b c d`, `(,,,,,) a b c d e` and
+    `(,,,,,) a b c d e f`
   * Exceptions thrown by weak pointer finalizers are now reported via a global
     exception handler.
-  * Add `GHC.Weak.Finalize.{get,set}FinalizerExceptionHandler` which the user to
-    override the above-mentioned handler.
-  * `Numeric.Natural` re-exports `GHC.Natural.minusNaturalMaybe`.
-  * Add `Data.Foldable1` and `Data.Bifoldable1`.
-  * Add `applyWhen` to `Data.Function`.
-  * Add functions `mapAccumM` and `forAccumM` to `Data.Traversable`, per the
-    [Core Libraries proposal](https://github.com/haskell/core-libraries-committee/issues/65).
+  * Add `GHC.Weak.Finalize.{get,set}FinalizerExceptionHandler` which allows the
+    user to override the above-mentioned handler.
+  * `Numeric.Natural` re-exports `GHC.Natural.minusNaturalMaybe`
+    ([CLC proposal #45](https://github.com/haskell/core-libraries-committee/issues/45))
+  * Add `Data.Foldable1` and `Data.Bifoldable1`
+    ([CLC proposal #9](https://github.com/haskell/core-libraries-committee/issues/9))
+  * Add `applyWhen` to `Data.Function`
+    ([CLC proposal #71](https://github.com/haskell/core-libraries-committee/issues/71))
+  * Add functions `mapAccumM` and `forAccumM` to `Data.Traversable`
+    ([CLC proposal #65](https://github.com/haskell/core-libraries-committee/issues/65))
   * Add default implementation of `(<>)` in terms of `sconcat` and `mempty` in
-    terms of `mconcat`.
+    terms of `mconcat` ([CLC proposal #61](https://github.com/haskell/core-libraries-committee/issues/61)).
   * `GHC.Conc.Sync.listThreads` was added, allowing the user to list the threads
     (both running and blocked) of the program.
   * `GHC.Conc.Sync.labelThreadByteArray#` was added, allowing the user to specify
@@ -23,25 +26,24 @@
     function.
   * `GHC.Conc.Sync.threadLabel` was added, allowing the user to query the label
     of a given `ThreadId`.
-  * Add `inits1` and `tails1` to `Data.List.NonEmpty`.
+  * Add `inits1` and `tails1` to `Data.List.NonEmpty`
+    ([CLC proposal #67](https://github.com/haskell/core-libraries-committee/issues/67))
   * Change default `Ord` implementation of `(>=)`, `(>)`, and `(<)` to use
-    `(<=)` instead of `compare` per
-    [Core Libraries proposal](https://github.com/haskell/core-libraries-committee/issues/24).
+    `(<=)` instead of `compare` ([CLC proposal #24](https://github.com/haskell/core-libraries-committee/issues/24)).
   * Export `liftA2` from `Prelude`. This means that the entirety of `Applicative`
-    is now exported from `Prelude`. See [CLC #50](https://github.com/haskell/core-libraries-committee/issues/50)
-    for the related discussion,
-    as well as [the migration guide](https://github.com/haskell/core-libraries-committee/blob/main/guides/export-lifta2-prelude.md)
+    is now exported from `Prelude`
+    ([CLC proposal #50](https://github.com/haskell/core-libraries-committee/issues/50),
+    [the migration
+    guide](https://github.com/haskell/core-libraries-committee/blob/main/guides/export-lifta2-prelude.md))
   * Update to [Unicode 15.0.0](https://www.unicode.org/versions/Unicode15.0.0/).
   * Add standard Unicode case predicates `isUpperCase` and `isLowerCase` to
     `GHC.Unicode` and `Data.Char`. These predicates use the standard Unicode
-    case properties and are more intuitive than `isUpper` and `isLower`. See
-    [CLC proposal #90](https://github.com/haskell/core-libraries-committee/issues/90).
+    case properties and are more intuitive than `isUpper` and `isLower`
+    ([CLC proposal #90](https://github.com/haskell/core-libraries-committee/issues/90))
   * Add `Eq` and `Ord` instances for `Generically1`.
   * Relax instances for Functor combinators; put superclass on Class1 and Class2
-    to make non-breaking. See [CLC
-    #10](https://github.com/haskell/core-libraries-committee/issues/10) for the
-    related discussion, as well as [the migration
-    guide](https://github.com/haskell/core-libraries-committee/blob/main/guides/functor-combinator-instances-and-class1s.md).
+    to make non-breaking ([CLC proposal #10](https://github.com/haskell/core-libraries-committee/issues/10), 
+    [migration guide](https://github.com/haskell/core-libraries-committee/blob/main/guides/functor-combinator-instances-and-class1s.md))
   * Add `gcdetails_block_fragmentation_bytes` to `GHC.Stats.GCDetails` to track heap fragmentation.
   * `GHC.TypeLits` and `GHC.TypeNats` now export the `natSing`, `symbolSing`,
     and `charSing` methods of `KnownNat`, `KnownSymbol`, and `KnownChar`,
@@ -50,16 +52,16 @@
     types, per
     [CLC proposal #85](https://github.com/haskell/core-libraries-committee/issues/85).
   * The `Enum` instance of `Down a` now enumerates values in the opposite
-    order as the `Enum a` instance, per
-    [CLC proposal #51](https://github.com/haskell/core-libraries-committee/issues/51).
+    order as the `Enum a` instance ([CLC proposal #51](https://github.com/haskell/core-libraries-committee/issues/51))
   * `Foreign.Marshal.Pool` now uses the RTS internal arena instead of libc
     `malloc` for allocation. It avoids the O(n) overhead of maintaining a list
     of individually allocated pointers as well as freeing each one of them when
-    freeing a `Pool`. (#14762) (#18338)
+    freeing a `Pool` (#14762, #18338)
   * `Type.Reflection.Unsafe` is now marked as unsafe.
-  * Add `Data.Typeable.heqT`, a kind-heterogeneous version of `Data.Typeable.eqT`.
-  * Add `Data.List.!?` per
-    [CLC proposal #110](https://github.com/haskell/core-libraries-committee/issues/110).
+  * Add `Data.Typeable.heqT`, a kind-heterogeneous version of
+    `Data.Typeable.eqT`
+    ([CLC proposal #99](https://github.com/haskell/core-libraries-committee/issues/99))
+  * Add `Data.List.!?` ([CLC proposal #110](https://github.com/haskell/core-libraries-committee/issues/110))
   * `maximumBy`/`minimumBy` are now marked as `INLINE` improving performance for unpackable
     types significantly.
 


=====================================
testsuite/tests/typecheck/should_compile/T22519.hs
=====================================
@@ -0,0 +1,21 @@
+{-# LANGUAGE Haskell2010 #-}
+{-# LANGUAGE DataKinds #-}
+{-# LANGUAGE TypeFamilies #-}
+{-# LANGUAGE TypeOperators #-}
+module Bug where
+
+import Data.Coerce (coerce)
+import Data.Kind (Type)
+import GHC.TypeNats (Nat, type (<=))
+
+f :: (1 <= w)
+  => IO (SymBV' sym w)
+  -> IO (SymBV sym w)
+f = coerce
+
+----
+
+data BaseType = BaseBVType Nat
+type family SymExpr (sym :: Type) :: BaseType -> Type
+type SymBV sym n = SymExpr sym (BaseBVType n)
+newtype SymBV' sym w = MkSymBV' (SymBV sym w)


=====================================
testsuite/tests/typecheck/should_compile/all.T
=====================================
@@ -457,6 +457,7 @@ test('T10335', normal, compile, [''])
 test('Improvement', normal, compile, [''])
 test('T10009', normal, compile, [''])
 test('T10390', normal, compile, [''])
+test('T22519', normal, compile, [''])
 test('T8555', normal, compile, [''])
 test('T8799', normal, compile, [''])
 test('T10432', normal, compile, [''])



View it on GitLab: https://gitlab.haskell.org/ghc/ghc/-/compare/91b381abf5e4728f76f5701afe5e68c295cd262a...e2c5e8200c70ab98c22911a45a122d316766b510
