[Git][ghc/ghc][wip/marge_bot_batch_merge_job] 4 commits: Delete created temporary subdirectories at end of session.

Marge Bot (@marge-bot) gitlab at gitlab.haskell.org
Fri Mar 10 12:38:58 UTC 2023



Marge Bot pushed to branch wip/marge_bot_batch_merge_job at Glasgow Haskell Compiler / GHC


Commits:
f97c7f6d by Luite Stegeman at 2023-03-09T09:52:09-05:00
Delete created temporary subdirectories at end of session.

This patch adds temporary subdirectories to the list of
paths to clean up at the end of the GHC session. This
fixes warnings about non-empty temporary directories.

Fixes #22952
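
A minimal usage sketch of the new behaviour (not part of the patch; the helper
name `stageLibrary`, the concrete file name and the assumption that
`addFilesToClean` remains exported from GHC.Utils.TmpFs are mine, for
illustration only -- the real call site is the GHC.Linker.Static change below).
The subdirectory is registered for removal at the end of the session, while
anything written into it must still be registered separately so the directory
is empty by then:

    import GHC.Utils.Logger (Logger)
    import GHC.Utils.TmpFs
    import System.Directory (copyFile)
    import System.FilePath ((</>), takeFileName)

    -- Copy a library into a fresh temporary subdirectory; both the copy and
    -- the subdirectory itself are then cleaned up when the GHC session ends.
    stageLibrary :: Logger -> TmpFs -> TempDir -> FilePath -> IO FilePath
    stageLibrary logger tmpfs tmp_dir lib = do
      subdir <- newTempSubDir logger tmpfs tmp_dir   -- removed at end of session
      let dest = subdir </> takeFileName lib
      copyFile lib dest
      addFilesToClean tmpfs TFL_GhcSession [dest]    -- keep the subdir empty by then
      return dest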

- - - - -
9ea719f2 by Apoorv Ingle at 2023-03-09T09:52:45-05:00
Fixes #19627.

Previously the solver failed with an unhelpful "solver reached too many iterations" error.
With the fix for #21909 in place we can no longer generate such an error as long as `-fconstraint-solver-iterations` > `-fgivens-fuel` > `-fwanteds-fuel`. This holds by default, and that fix also gives programmers a knob to control how hard the solver should try before giving up (a minimal sketch of the flags follows the list below).

This commit adds:
* Reference to ticket #19627 in the Note [Expanding Recursive Superclasses and ExpansionFuel]
* Test `typecheck/should_fail/T19627.hs` for regression purposes
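
A minimal sketch of turning that knob (not part of this commit; setting the
flags per module via OPTIONS_GHC pragmas and the `=n` argument syntax are
assumptions about the flags named above):

    {-# OPTIONS_GHC -fconstraint-solver-iterations=5 #-}
    {-# OPTIONS_GHC -fgivens-fuel=4 #-}
    {-# OPTIONS_GHC -fwanteds-fuel=2 #-}
    -- Keeps iterations > givens fuel > wanteds fuel, as required above,
    -- while letting the solver try harder before giving up.
    module FuelDemo where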

- - - - -
26170dee by Sebastian Graf at 2023-03-10T07:38:36-05:00
DmdAnal: Fix a panic on OPAQUE and trivial/PAP RHS (#22997)

We should not panic in `add_demands` (now `set_lam_dmds`), because that code
path is legitimately taken for OPAQUE PAP bindings, as in T22997.

Fixes #22997.
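
For reference, the shape that used to hit the panic is a partial application
bound under an OPAQUE pragma; the new T22997 test included further below is
exactly this:

    {-# OPAQUE pap #-}
    pap :: Integer -> Integer
    pap = (42 +)   -- PAP right-hand side: no lambda binders to attach demands to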

- - - - -
32163a18 by Sylvain Henry at 2023-03-10T07:38:48-05:00
JS: remove dead code for old integer-gmp

- - - - -


10 changed files:

- compiler/GHC/Core/Opt/DmdAnal.hs
- compiler/GHC/Linker/Static.hs
- compiler/GHC/Tc/Solver.hs
- compiler/GHC/Utils/TmpFs.hs
- rts/js/rts.js
- + testsuite/tests/stranal/should_compile/T22997.hs
- testsuite/tests/stranal/should_compile/all.T
- + testsuite/tests/typecheck/should_fail/T19627.hs
- + testsuite/tests/typecheck/should_fail/T19627.stderr
- testsuite/tests/typecheck/should_fail/all.T


Changes:

=====================================
compiler/GHC/Core/Opt/DmdAnal.hs
=====================================
@@ -1916,10 +1916,11 @@ finaliseArgBoxities env fn threshold_arity rhs_dmds div rhs
 
   -- Check for an OPAQUE function: see Note [OPAQUE pragma]
   -- In that case, trim off all boxity info from argument demands
+  -- and demand info on lambda binders
   -- See Note [The OPAQUE pragma and avoiding the reboxing of arguments]
   | isOpaquePragma (idInlinePragma fn)
   , let trimmed_rhs_dmds = map trimBoxity rhs_dmds
-  = (trimmed_rhs_dmds, add_demands trimmed_rhs_dmds rhs)
+  = (trimmed_rhs_dmds, set_lam_dmds trimmed_rhs_dmds rhs)
 
   -- Check that we have enough visible binders to match the
   -- threshold arity; if not, we won't do worker/wrapper
@@ -1939,8 +1940,8 @@ finaliseArgBoxities env fn threshold_arity rhs_dmds div rhs
     --   vcat [text "function:" <+> ppr fn
     --        , text "dmds before:" <+> ppr (map idDemandInfo (filter isId bndrs))
     --        , text "dmds after: " <+>  ppr arg_dmds' ]) $
-    (arg_dmds', add_demands arg_dmds' rhs)
-    -- add_demands: we must attach the final boxities to the lambda-binders
+    (arg_dmds', set_lam_dmds arg_dmds' rhs)
+    -- set_lam_dmds: we must attach the final boxities to the lambda-binders
     -- of the function, both because that's kosher, and because CPR analysis
     -- uses the info on the binders directly.
   where
@@ -2032,17 +2033,18 @@ finaliseArgBoxities env fn threshold_arity rhs_dmds div rhs
                  | positiveTopBudget bg_inner' = (bg_inner', dmd')
                  | otherwise                   = (bg_inner,  trimBoxity dmd)
 
-    add_demands :: [Demand] -> CoreExpr -> CoreExpr
+    set_lam_dmds :: [Demand] -> CoreExpr -> CoreExpr
     -- Attach the demands to the outer lambdas of this expression
-    add_demands [] e = e
-    add_demands (dmd:dmds) (Lam v e)
-      | isTyVar v = Lam v (add_demands (dmd:dmds) e)
-      | otherwise = Lam (v `setIdDemandInfo` dmd) (add_demands dmds e)
-    add_demands dmds (Cast e co) = Cast (add_demands dmds e) co
+    set_lam_dmds (dmd:dmds) (Lam v e)
+      | isTyVar v = Lam v (set_lam_dmds (dmd:dmds) e)
+      | otherwise = Lam (v `setIdDemandInfo` dmd) (set_lam_dmds dmds e)
+    set_lam_dmds dmds (Cast e co) = Cast (set_lam_dmds dmds e) co
        -- This case happens for an OPAQUE function, which may look like
        --     f = (\x y. blah) |> co
        -- We give it strictness but no boxity (#22502)
-    add_demands dmds e = pprPanic "add_demands" (ppr dmds $$ ppr e)
+    set_lam_dmds _ e = e
+       -- In the OPAQUE case, the list of demands at this point might be
+       -- non-empty, e.g., when looking at a PAP. Hence don't panic (#22997).
 
 finaliseLetBoxity
   :: AnalEnv


=====================================
compiler/GHC/Linker/Static.hs
=====================================
@@ -126,7 +126,7 @@ linkBinary' staticLink logger tmpfs dflags unit_env o_files dep_units = do
       if gopt Opt_SingleLibFolder dflags
       then do
         libs <- getLibs namever ways_ unit_env dep_units
-        tmpDir <- newTempDir logger tmpfs (tmpDir dflags)
+        tmpDir <- newTempSubDir logger tmpfs (tmpDir dflags)
         sequence_ [ copyFile lib (tmpDir </> basename)
                   | (lib, basename) <- libs]
         return [ "-L" ++ tmpDir ]


=====================================
compiler/GHC/Tc/Solver.hs
=====================================
@@ -2371,7 +2371,7 @@ any new unifications, and iterate the implications only if so.
 -}
 
 {- Note [Expanding Recursive Superclasses and ExpansionFuel]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Consider the class declaration (T21909)
 
     class C [a] => C a where
@@ -2431,7 +2431,7 @@ There are two preconditions for the default fuel values:
 Precondition (1) ensures that we expand givens at least as many times as we expand wanted constraints
 preferably givenFuel > wantedsFuel to avoid issues like T21909 while
 the precondition (2) ensures that we do not reach the solver iteration limit and fail with a
-more meaningful error message
+more meaningful error message (see T19627)
 
 This also applies for quantified constraints; see `-fqcs-fuel` compiler flag and `QCI.qci_pend_sc` field.
 -}


=====================================
compiler/GHC/Utils/TmpFs.hs
=====================================
@@ -6,8 +6,8 @@ module GHC.Utils.TmpFs
     , initTmpFs
     , forkTmpFsFrom
     , mergeTmpFsInto
-    , FilesToClean(..)
-    , emptyFilesToClean
+    , PathsToClean(..)
+    , emptyPathsToClean
     , TempFileLifetime(..)
     , TempDir (..)
     , cleanTempDirs
@@ -17,7 +17,7 @@ module GHC.Utils.TmpFs
     , changeTempFilesLifetime
     , newTempName
     , newTempLibName
-    , newTempDir
+    , newTempSubDir
     , withSystemTempDirectory
     , withTempDirectory
     )
@@ -63,25 +63,29 @@ data TmpFs = TmpFs
       --
       -- Shared with forked TmpFs.
 
-  , tmp_files_to_clean :: IORef FilesToClean
+  , tmp_files_to_clean :: IORef PathsToClean
       -- ^ Files to clean (per session or per module)
       --
       -- Not shared with forked TmpFs.
+  , tmp_subdirs_to_clean :: IORef PathsToClean
+      -- ^ Subdirs to clean (per session or per module)
+      --
+      -- Not shared with forked TmpFs.
   }
 
--- | A collection of files that must be deleted before ghc exits.
-data FilesToClean = FilesToClean
-    { ftcGhcSession :: !(Set FilePath)
-        -- ^ Files that will be deleted at the end of runGhc(T)
+-- | A collection of paths that must be deleted before ghc exits.
+data PathsToClean = PathsToClean
+    { ptcGhcSession :: !(Set FilePath)
+        -- ^ Paths that will be deleted at the end of runGhc(T)
 
-    , ftcCurrentModule :: !(Set FilePath)
-        -- ^ Files that will be deleted the next time
+    , ptcCurrentModule :: !(Set FilePath)
+        -- ^ Paths that will be deleted the next time
         -- 'cleanCurrentModuleTempFiles' is called, or otherwise at the end of
         -- the session.
     }
 
 -- | Used when a temp file is created. This determines which component Set of
--- FilesToClean will get the temp file
+-- PathsToClean will get the temp file
 data TempFileLifetime
   = TFL_CurrentModule
   -- ^ A file with lifetime TFL_CurrentModule will be cleaned up at the
@@ -93,38 +97,45 @@ data TempFileLifetime
 
 newtype TempDir = TempDir FilePath
 
--- | An empty FilesToClean
-emptyFilesToClean :: FilesToClean
-emptyFilesToClean = FilesToClean Set.empty Set.empty
+-- | An empty PathsToClean
+emptyPathsToClean :: PathsToClean
+emptyPathsToClean = PathsToClean Set.empty Set.empty
 
--- | Merge two FilesToClean
-mergeFilesToClean :: FilesToClean -> FilesToClean -> FilesToClean
-mergeFilesToClean x y = FilesToClean
-    { ftcGhcSession    = Set.union (ftcGhcSession x) (ftcGhcSession y)
-    , ftcCurrentModule = Set.union (ftcCurrentModule x) (ftcCurrentModule y)
+-- | Merge two PathsToClean
+mergePathsToClean :: PathsToClean -> PathsToClean -> PathsToClean
+mergePathsToClean x y = PathsToClean
+    { ptcGhcSession    = Set.union (ptcGhcSession x) (ptcGhcSession y)
+    , ptcCurrentModule = Set.union (ptcCurrentModule x) (ptcCurrentModule y)
     }
 
 -- | Initialise an empty TmpFs
 initTmpFs :: IO TmpFs
 initTmpFs = do
-    files <- newIORef emptyFilesToClean
-    dirs  <- newIORef Map.empty
-    next  <- newIORef 0
+    files   <- newIORef emptyPathsToClean
+    subdirs <- newIORef emptyPathsToClean
+    dirs    <- newIORef Map.empty
+    next    <- newIORef 0
     return $ TmpFs
-        { tmp_files_to_clean = files
-        , tmp_dirs_to_clean  = dirs
-        , tmp_next_suffix    = next
+        { tmp_files_to_clean   = files
+        , tmp_subdirs_to_clean = subdirs
+        , tmp_dirs_to_clean    = dirs
+        , tmp_next_suffix      = next
         }
 
 -- | Initialise an empty TmpFs sharing unique numbers and per-process temporary
 -- directories with the given TmpFs
+--
+-- It's not safe to use the subdirs created by the original TmpFs with the
+-- forked one. Use @newTempSubDir@ to create new subdirs instead.
 forkTmpFsFrom :: TmpFs -> IO TmpFs
 forkTmpFsFrom old = do
-    files <- newIORef emptyFilesToClean
+    files <- newIORef emptyPathsToClean
+    subdirs <- newIORef emptyPathsToClean
     return $ TmpFs
-        { tmp_files_to_clean = files
-        , tmp_dirs_to_clean  = tmp_dirs_to_clean old
-        , tmp_next_suffix    = tmp_next_suffix old
+        { tmp_files_to_clean   = files
+        , tmp_subdirs_to_clean = subdirs
+        , tmp_dirs_to_clean    = tmp_dirs_to_clean old
+        , tmp_next_suffix      = tmp_next_suffix old
         }
 
 -- | Merge the first TmpFs into the second.
@@ -132,8 +143,11 @@ forkTmpFsFrom old = do
 -- The first TmpFs is returned emptied.
 mergeTmpFsInto :: TmpFs -> TmpFs -> IO ()
 mergeTmpFsInto src dst = do
-    src_files <- atomicModifyIORef' (tmp_files_to_clean src) (\s -> (emptyFilesToClean, s))
-    atomicModifyIORef' (tmp_files_to_clean dst) (\s -> (mergeFilesToClean src_files s, ()))
+    src_files <- atomicModifyIORef' (tmp_files_to_clean src) (\s -> (emptyPathsToClean, s))
+    src_subdirs <- atomicModifyIORef' (tmp_subdirs_to_clean src) (\s -> (emptyPathsToClean, s))
+    atomicModifyIORef' (tmp_files_to_clean dst) (\s -> (mergePathsToClean src_files s, ()))
+    atomicModifyIORef' (tmp_subdirs_to_clean dst) (\s -> (mergePathsToClean src_subdirs s, ()))
+
 
 cleanTempDirs :: Logger -> TmpFs -> IO ()
 cleanTempDirs logger tmpfs
@@ -142,64 +156,78 @@ cleanTempDirs logger tmpfs
         ds <- atomicModifyIORef' ref $ \ds -> (Map.empty, ds)
         removeTmpDirs logger (Map.elems ds)
 
--- | Delete all files in @tmp_files_to_clean@.
+-- | Delete all paths in @tmp_files_to_clean@ and @tmp_subdirs_to_clean@.
 cleanTempFiles :: Logger -> TmpFs -> IO ()
 cleanTempFiles logger tmpfs
    = mask_
-   $ do let ref = tmp_files_to_clean tmpfs
-        to_delete <- atomicModifyIORef' ref $
-            \FilesToClean
-                { ftcCurrentModule = cm_files
-                , ftcGhcSession = gs_files
-                } -> ( emptyFilesToClean
-                     , Set.toList cm_files ++ Set.toList gs_files)
-        removeTmpFiles logger to_delete
-
--- | Delete all files in @tmp_files_to_clean@. That have lifetime
--- TFL_CurrentModule.
+   $ do removeWith (removeTmpFiles logger) (tmp_files_to_clean tmpfs)
+        removeWith (removeTmpSubdirs logger) (tmp_subdirs_to_clean tmpfs)
+  where
+    removeWith remove ref = do
+      to_delete <- atomicModifyIORef' ref $
+        \PathsToClean
+            { ptcCurrentModule = cm_paths
+            , ptcGhcSession = gs_paths
+            } -> ( emptyPathsToClean
+                  , Set.toList cm_paths ++ Set.toList gs_paths)
+      remove to_delete
+
+-- | Delete all paths in @tmp_files_to_clean@ and @tmp_subdirs_to_clean@
+-- That have lifetime TFL_CurrentModule.
 -- If a file must be cleaned eventually, but must survive a
 -- cleanCurrentModuleTempFiles, ensure it has lifetime TFL_GhcSession.
 cleanCurrentModuleTempFiles :: Logger -> TmpFs -> IO ()
 cleanCurrentModuleTempFiles logger tmpfs
    = mask_
-   $ do let ref = tmp_files_to_clean tmpfs
+   $ do removeWith (removeTmpFiles logger) (tmp_files_to_clean tmpfs)
+        removeWith (removeTmpSubdirs logger) (tmp_subdirs_to_clean tmpfs)
+  where
+    removeWith remove ref = do
         to_delete <- atomicModifyIORef' ref $
-            \ftc@FilesToClean{ftcCurrentModule = cm_files} ->
-                (ftc {ftcCurrentModule = Set.empty}, Set.toList cm_files)
-        removeTmpFiles logger to_delete
+            \ptc@PathsToClean{ptcCurrentModule = cm_paths} ->
+                (ptc {ptcCurrentModule = Set.empty}, Set.toList cm_paths)
+        remove to_delete
 
 -- | Ensure that new_files are cleaned on the next call of
 -- 'cleanTempFiles' or 'cleanCurrentModuleTempFiles', depending on lifetime.
 -- If any of new_files are already tracked, they will have their lifetime
 -- updated.
 addFilesToClean :: TmpFs -> TempFileLifetime -> [FilePath] -> IO ()
-addFilesToClean tmpfs lifetime new_files = modifyIORef' (tmp_files_to_clean tmpfs) $
-  \FilesToClean
-    { ftcCurrentModule = cm_files
-    , ftcGhcSession = gs_files
+addFilesToClean tmpfs lifetime new_files =
+  addToClean (tmp_files_to_clean tmpfs) lifetime new_files
+
+addSubdirsToClean :: TmpFs -> TempFileLifetime -> [FilePath] -> IO ()
+addSubdirsToClean tmpfs lifetime new_subdirs =
+  addToClean (tmp_subdirs_to_clean tmpfs) lifetime new_subdirs
+
+addToClean :: IORef PathsToClean -> TempFileLifetime -> [FilePath] -> IO ()
+addToClean ref lifetime new_filepaths = modifyIORef' ref $
+  \PathsToClean
+    { ptcCurrentModule = cm_paths
+    , ptcGhcSession = gs_paths
     } -> case lifetime of
-      TFL_CurrentModule -> FilesToClean
-        { ftcCurrentModule = cm_files `Set.union` new_files_set
-        , ftcGhcSession = gs_files `Set.difference` new_files_set
+      TFL_CurrentModule -> PathsToClean
+        { ptcCurrentModule = cm_paths `Set.union` new_filepaths_set
+        , ptcGhcSession = gs_paths `Set.difference` new_filepaths_set
         }
-      TFL_GhcSession -> FilesToClean
-        { ftcCurrentModule = cm_files `Set.difference` new_files_set
-        , ftcGhcSession = gs_files `Set.union` new_files_set
+      TFL_GhcSession -> PathsToClean
+        { ptcCurrentModule = cm_paths `Set.difference` new_filepaths_set
+        , ptcGhcSession = gs_paths `Set.union` new_filepaths_set
         }
   where
-    new_files_set = Set.fromList new_files
+    new_filepaths_set = Set.fromList new_filepaths
 
 -- | Update the lifetime of files already being tracked. If any files are
 -- not being tracked they will be discarded.
 changeTempFilesLifetime :: TmpFs -> TempFileLifetime -> [FilePath] -> IO ()
 changeTempFilesLifetime tmpfs lifetime files = do
-  FilesToClean
-    { ftcCurrentModule = cm_files
-    , ftcGhcSession = gs_files
+  PathsToClean
+    { ptcCurrentModule = cm_paths
+    , ptcGhcSession = gs_paths
     } <- readIORef (tmp_files_to_clean tmpfs)
   let old_set = case lifetime of
-        TFL_CurrentModule -> gs_files
-        TFL_GhcSession -> cm_files
+        TFL_CurrentModule -> gs_paths
+        TFL_GhcSession -> cm_paths
       existing_files = [f | f <- files, f `Set.member` old_set]
   addFilesToClean tmpfs lifetime existing_files
 
@@ -224,20 +252,32 @@ newTempName logger tmpfs tmp_dir lifetime extn
                         addFilesToClean tmpfs lifetime [filename]
                         return filename
 
-newTempDir :: Logger -> TmpFs -> TempDir -> IO FilePath
-newTempDir logger tmpfs tmp_dir
+-- | Create a new temporary subdirectory that doesn't already exist
+-- The temporary subdirectory is automatically removed at the end of the
+-- GHC session, but its contents aren't. Make sure to leave the directory
+-- empty before the end of the session, either by removing content
+-- directly or by using @addFilesToClean@.
+--
+-- If the created subdirectory is not empty, it will not be removed (along
+-- with its parent temporary directory) and a warning message will be
+-- printed at verbosity 2 and higher.
+newTempSubDir :: Logger -> TmpFs -> TempDir -> IO FilePath
+newTempSubDir logger tmpfs tmp_dir
   = do d <- getTempDir logger tmpfs tmp_dir
        findTempDir (d </> "ghc_")
   where
     findTempDir :: FilePath -> IO FilePath
     findTempDir prefix
       = do n <- newTempSuffix tmpfs
-           let filename = prefix ++ show n
-           b <- doesDirectoryExist filename
+           let name = prefix ++ show n
+           b <- doesDirectoryExist name
            if b then findTempDir prefix
-                else do createDirectory filename
-                        -- see mkTempDir below; this is wrong: -> consIORef (tmp_dirs_to_clean tmpfs) filename
-                        return filename
+                else (do
+                  createDirectory name
+                  addSubdirsToClean tmpfs TFL_GhcSession [name]
+                  return name)
+            `Exception.catchIO` \e -> if isAlreadyExistsError e
+                  then findTempDir prefix else ioError e
 
 newTempLibName :: Logger -> TmpFs -> TempDir -> TempFileLifetime -> Suffix
   -> IO (FilePath, FilePath, String)
@@ -338,6 +378,12 @@ removeTmpFiles logger fs
 
     (non_deletees, deletees) = partition isHaskellUserSrcFilename fs
 
+removeTmpSubdirs :: Logger -> [FilePath] -> IO ()
+removeTmpSubdirs logger fs
+  = traceCmd logger "Deleting temp subdirs"
+             ("Deleting: " ++ unwords fs)
+             (mapM_ (removeWith logger removeDirectory) fs)
+
 removeWith :: Logger -> (FilePath -> IO ()) -> FilePath -> IO ()
 removeWith logger remover f = remover f `Exception.catchIO`
   (\e ->


=====================================
rts/js/rts.js
=====================================
@@ -365,14 +365,7 @@ function h$printReg(r) {
     } else if(r.f.t === h$ct_blackhole && r.x) {
       return ("blackhole: -> " + h$printReg({ f: r.x.x1, d: r.d1.x2 }) + ")");
     } else {
-      var iv = "";
-      if(r.f.n === "integer-gmp:GHC.Integer.Type.Jp#" ||
-      r.f.n === "integer-gmp:GHC.Integer.Type.Jn#") {
-        iv = ' [' + r.d1.join(',') + '](' + h$ghcjsbn_showBase(r.d1, 10) + ')'
-      } else if(r.f.n === "integer-gmp:GHC.Integer.Type.S#") {
-        iv = ' (S: ' + r.d1 + ')';
-      }
-      return ((r.alloc ? r.alloc + ': ' : '') + r.f.n + " (" + h$closureTypeName(r.f.t) + ", " + r.f.a + ")" + iv);
+      return ((r.alloc ? r.alloc + ': ' : '') + r.f.n + " (" + h$closureTypeName(r.f.t) + ", " + r.f.a + ")");
     }
   } else if(typeof r === 'object') {
     var res = h$collectProps(r);
@@ -536,14 +529,7 @@ function h$dumpStackTop(stack, start, sp) {
           if(s.f.t === h$ct_blackhole && s.d1 && s.d1.x1 && s.d1.x1.n) {
             h$log("stack[" + i + "] = blackhole -> " + s.d1.x1.n);
           } else {
-            var iv = "";
-            if(s.f.n === "integer-gmp:GHC.Integer.Type.Jp#" ||
-            s.f.n === "integer-gmp:GHC.Integer.Type.Jn#") {
-              iv = ' [' + s.d1.join(',') + '](' + h$ghcjsbn_showBase(s.d1, 10) + ')'
-            } else if(s.f.n === "integer-gmp:GHC.Integer.Type.S#") {
-              iv = ' (S: ' + s.d1 + ')';
-            }
-            h$log("stack[" + i + "] = -> " + (s.alloc ? s.alloc + ': ' : '') + s.f.n + " (" + h$closureTypeName(s.f.t) + ", a: " + s.f.a + ")" + iv);
+            h$log("stack[" + i + "] = -> " + (s.alloc ? s.alloc + ': ' : '') + s.f.n + " (" + h$closureTypeName(s.f.t) + ", a: " + s.f.a + ")");
           }
         }
       } else if(h$isInstanceOf(s,h$MVar)) {


=====================================
testsuite/tests/stranal/should_compile/T22997.hs
=====================================
@@ -0,0 +1,9 @@
+module T22997 where
+
+{-# OPAQUE trivial #-}
+trivial :: Int -> Int
+trivial = succ
+
+{-# OPAQUE pap #-}
+pap :: Integer -> Integer
+pap = (42 +)


=====================================
testsuite/tests/stranal/should_compile/all.T
=====================================
@@ -88,3 +88,5 @@ test('EtaExpansion', normal, compile, [''])
 test('T22039', normal, compile, [''])
 # T22388: Should see $winteresting but not $wboring
 test('T22388', [ grep_errmsg(r'^\S+\$w\S+') ], compile, ['-dsuppress-uniques -ddump-simpl'])
+# T22997: Just a panic that should not happen
+test('T22997', normal, compile, [''])


=====================================
testsuite/tests/typecheck/should_fail/T19627.hs
=====================================
@@ -0,0 +1,108 @@
+{-# language BlockArguments #-}
+{-# language DefaultSignatures #-}
+{-# language DerivingStrategies #-}
+{-# language EmptyCase #-}
+{-# language ExplicitNamespaces #-}
+{-# language ImportQualifiedPost #-}
+{-# language FlexibleContexts #-}
+{-# language FlexibleInstances #-}
+{-# language FunctionalDependencies #-}
+{-# language GADTs #-}
+{-# language LambdaCase #-}
+{-# language LinearTypes #-}
+{-# language NoStarIsType #-}
+{-# language PolyKinds #-}
+{-# language QuantifiedConstraints #-}
+{-# language RankNTypes #-}
+{-# language RoleAnnotations #-}
+{-# language ScopedTypeVariables #-}
+{-# language StandaloneDeriving #-}
+{-# language StandaloneKindSignatures #-}
+{-# language StrictData #-}
+{-# language TupleSections #-}
+{-# language TypeApplications #-}
+{-# language TypeFamilies #-}
+{-# language TypeFamilyDependencies #-}
+{-# language TypeOperators #-}
+{-# language UndecidableInstances #-}
+{-# language UndecidableSuperClasses #-}
+
+module T19627 where
+
+import Data.Kind
+import Prelude hiding ( Functor(..) )
+
+--------------------
+
+class (Prop (Not p), Not (Not p) ~ p) => Prop (p :: Type) where
+  type Not p :: Type
+  (!=) :: p -> Not p -> r
+
+data Y (a :: Type) (b :: Type) (c :: Type) where
+  L :: Y a b a
+  R :: Y a b b
+
+newtype a & b = With (forall c. Y a b c -> c)
+
+with :: (forall c. Y a b c -> c) -> a & b
+with = With
+
+runWith :: a & b -> Y a b c -> c
+runWith (With f) = f
+
+withL' :: a & b -> a
+withL' (With f) = f L
+
+withR' :: a & b -> b
+withR' (With f) = f R
+
+instance (Prop a, Prop b) => Prop (a & b) where
+  type Not (a & b) = Not a `Either` Not b
+  w != Left a  = withL' w != a
+  w != Right b = withR' w != b
+
+instance (Prop a, Prop b) => Prop (Either a b) where
+  type Not (Either a b) = Not a & Not b
+  Left a  != w = a != withL' w
+  Right a != w = a != withR' w
+
+newtype Yoneda f a = Yoneda
+  (forall r. Prop r => (a -> r) -> f r)
+
+data Noneda f a where
+  Noneda :: Prop r => !(f r <#- (a ⊸ r)) -> Noneda f a
+
+liftYoneda :: forall f a i. (Functor f, Prop a, Iso i) => i (f a) (Yoneda f a)
+liftYoneda = iso \case
+  L -> lowerYoneda'
+  R -> lol \case
+    L -> \(Noneda ((a2r :: a ⊸ r) :-#> nfr)) -> runLol (fmap @f @a @r a2r) L nfr
+    R -> \fa -> Yoneda do
+      lol \case
+        R -> \f -> fmap' f fa
+        L -> \nfr -> whyNot \a2r -> fmap a2r fa != nfr
+
+
+type family NotApart (p :: Type -> Type -> Type) :: Type -> Type -> Type
+
+class
+  ( forall a b. (Prop a, Prop b) => Prop (p a b)
+  , NotApart (NotIso p) ~ p
+  ) => Iso p where
+  type NotIso p = (q :: Type -> Type -> Type) | q -> p
+  iso :: (forall c. Y (b ⊸ a) (a ⊸ b) c -> c) -> p a b
+
+data b <#- a where (:-#>) :: a -> Not b -> b <#- a
+newtype a ⊸ b = Lol (forall c. Y (Not b %1 -> Not a) (a %1 -> b) c -> c)
+
+class
+  ( forall a. Prop a => Prop (f a)
+  ) => Functor f where
+  fmap' :: (Prop a, Prop b, Lol l, Lol l') => l ((a ⊸ b)) (l' (f a) (f b))
+
+fmap :: forall f a b l. (Functor f, Prop a, Prop b, Lol l) => (a ⊸ b) -> l (f a) (f b)
+fmap f = fmap' f
+
+class Iso p => Lol (p :: Type -> Type -> Type) where
+  lol :: (forall c. Y (Not b -> Not a) (a -> b) c -> c) -> p a b
+  apartR :: Not (p a b) -> b <#- a


=====================================
testsuite/tests/typecheck/should_fail/T19627.stderr
=====================================
@@ -0,0 +1,45 @@
+
+T19627.hs:108:3: error: [GHC-05617]
+    • Could not deduce ‘Not (p0 a b) ~ Not (p a b)’
+      from the context: Lol p
+        bound by the type signature for:
+                   apartR :: forall (p :: * -> * -> *) a b.
+                             Lol p =>
+                             Not (p a b) -> b <#- a
+        at T19627.hs:108:3-34
+      Expected: Not (p a b) -> b <#- a
+        Actual: Not (p0 a b) -> b <#- a
+        NB: ‘Not’ is a non-injective type family
+        The type variable ‘p0’ is ambiguous
+    • In the ambiguity check for ‘apartR’
+      To defer the ambiguity check to use sites, enable AllowAmbiguousTypes
+      When checking the class method:
+        apartR :: forall (p :: * -> * -> *) a b.
+                  Lol p =>
+                  Not (p a b) -> b <#- a
+      In the class declaration for ‘Lol’
+
+T19627.hs:108:3: error: [GHC-05617]
+    • Could not deduce ‘Not (Not (p0 a1 b1)) ~ p0 a1 b1’
+        arising from a superclass required to satisfy ‘Prop (p0 a1 b1)’,
+        arising from the head of a quantified constraint
+        arising from a superclass required to satisfy ‘Iso p0’,
+        arising from a superclass required to satisfy ‘Lol p0’,
+        arising from a type ambiguity check for
+        the type signature for ‘apartR’
+      from the context: Lol p
+        bound by the type signature for:
+                   apartR :: forall (p :: * -> * -> *) a b.
+                             Lol p =>
+                             Not (p a b) -> b <#- a
+        at T19627.hs:108:3-34
+      or from: (Prop a1, Prop b1)
+        bound by a quantified context at T19627.hs:108:3-34
+        The type variable ‘p0’ is ambiguous
+    • In the ambiguity check for ‘apartR’
+      To defer the ambiguity check to use sites, enable AllowAmbiguousTypes
+      When checking the class method:
+        apartR :: forall (p :: * -> * -> *) a b.
+                  Lol p =>
+                  Not (p a b) -> b <#- a
+      In the class declaration for ‘Lol’


=====================================
testsuite/tests/typecheck/should_fail/all.T
=====================================
@@ -671,3 +671,4 @@ test('T20666a', normal, compile, [''])  # To become compile_fail after migration
 test('T22924a', normal, compile_fail, [''])
 test('T22924b', normal, compile_fail, [''])
 test('T22940', normal, compile_fail, [''])
+test('T19627', normal, compile_fail, [''])



View it on GitLab: https://gitlab.haskell.org/ghc/ghc/-/compare/ce9e4e731952339cd9a1a52681bc838fb8778edc...32163a1893061a4cd22974cf27280d68c51b2861
