[Git][ghc/ghc][wip/hadrian-windows-bindist] 26 commits: base: Add changelog entry for CLC #188
Matthew Pickering (@mpickering)
gitlab at gitlab.haskell.org
Wed Aug 16 12:11:23 UTC 2023
Matthew Pickering pushed to branch wip/hadrian-windows-bindist at Glasgow Haskell Compiler / GHC
Commits:
8e699b23 by Matthew Pickering at 2023-08-14T10:44:47-04:00
base: Add changelog entry for CLC #188
This proposal modified the implementations of copyBytes, moveBytes and
fillBytes (as detailed in the proposal)
https://github.com/haskell/core-libraries-committee/issues/188
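(Editor's sketch, not part of the commit: a minimal, self-contained usage example of the three Foreign.Marshal.Utils functions whose implementations the proposal changed. The signatures are the current Foreign.Marshal.Utils ones; the buffer size and byte value are made up for illustration, and the behavioural details of the change itself are in the linked issue.)
```
import Data.Word (Word8)
import Foreign.Marshal.Alloc (allocaBytes)
import Foreign.Marshal.Utils (copyBytes, fillBytes, moveBytes)
import Foreign.Ptr (plusPtr)
import Foreign.Storable (peekByteOff)

main :: IO ()
main =
  allocaBytes 16 $ \src ->
  allocaBytes 16 $ \dst -> do
    fillBytes src 0xab 16              -- memset-like: set 16 bytes to 0xab
    copyBytes dst src 16               -- memcpy-like: regions must not overlap
    moveBytes (src `plusPtr` 4) src 8  -- memmove-like: overlapping regions are fine
    b <- peekByteOff dst 0
    print (b :: Word8)                 -- prints 171 (0xab)
```
copyBytes requires non-overlapping regions, moveBytes tolerates overlap, and fillBytes sets every byte to the given value.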
- - - - -
026f040a by Matthew Pickering at 2023-08-14T10:45:23-04:00
packaging: Build manpage in separate directory to other documentation
We were installing two copies of the manpage:
* One useless one in the `share/doc` folder, because we copy the doc/
folder into share/
* The one we deliberately installed into `share/man` etc
The solution is to build the manpage into the `manpage` directory when
building the bindist, and then just install it separately.
Fixes #23707
- - - - -
524c60c8 by Bartłomiej Cieślar at 2023-08-14T13:46:33-04:00
Report deprecated fields bound by record wildcards when used
This commit ensures that we emit the appropriate warnings when
a deprecated record field bound by a record wildcard is used.
For example:
    module A where
    data Foo = Foo {x :: Int, y :: Bool, z :: Char}
    {-# DEPRECATED x "Don't use x" #-}
    {-# WARNING y "Don't use y" #-}

    module B where
    import A
    foo (Foo {..}) = x
This will cause us to emit a "Don't use x" warning whose location is
that of the record wildcard. Note that we don't warn about `y`,
because it is unused in the RHS of `foo`.
Fixes #23382
- - - - -
d6130065 by Matthew Pickering at 2023-08-14T13:47:11-04:00
Add zstd suffix to jobs which rely on zstd
This was causing some confusion as the job was named simply
"x86_64-linux-deb10-validate", which implies a standard configuration
rather than any dependency on libzstd.
- - - - -
e24e44fc by Matthew Pickering at 2023-08-14T13:47:11-04:00
ci: Always run project-version job
This is needed for the downstream test-primops pipeline to work out what
the version of a bindist produced by a pipeline is.
- - - - -
f17b9d62 by Matthew Pickering at 2023-08-14T13:47:11-04:00
gen_ci: Rework how jobs-metadata.json is generated
* We now represent a job group as a triple of Maybes, which makes it easier
to work out when jobs are enabled/disabled on certain pipelines.
```
data JobGroup a = StandardTriple { v :: Maybe (NamedJob a)
, n :: Maybe (NamedJob a)
, r :: Maybe (NamedJob a) }
```
* `jobs-metadata.json` generation is reworked using the following
algorithm.
- For each pipeline type, find all the platforms we are doing builds
for.
- Select one build per platform
- Zip together the results
This way we can choose different pipelines for validate/nightly/release
which makes the metadata also useful for validate pipelines. This
feature is used by the test-primops downstream CI in order to select the
right bindist for testing validate pipelines.
This makes it easier to inspect which jobs are going to be enabled on a
particular pipeline.
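(Editor's sketch, not part of the commit: a trimmed-down version of the zipping step described above. `Platform`, `NamedJob`, and the three per-pipeline maps are stand-ins; the real definitions live in the .gitlab/generate-ci/gen_ci.hs hunk further down, where the combination happens in `platform_mapping`.)
```
import qualified Data.Map as Map
import qualified Data.Set as S
import           Data.Map (Map)

type Platform = String

data NamedJob = NamedJob { njName :: String } deriving Show

data JobGroup = StandardTriple { v :: Maybe NamedJob
                               , n :: Maybe NamedJob
                               , r :: Maybe NamedJob } deriving Show

-- Zip three per-pipeline maps (one chosen build per platform each) into
-- one JobGroup per platform, leaving Nothing where a pipeline has no build.
zipPipelines :: Map Platform NamedJob   -- validate builds
             -> Map Platform NamedJob   -- nightly builds
             -> Map Platform NamedJob   -- release builds
             -> Map Platform JobGroup
zipPipelines vs ns rs =
  Map.fromList
    [ (p, StandardTriple (Map.lookup p vs) (Map.lookup p ns) (Map.lookup p rs))
    | p <- S.toList (Map.keysSet vs <> Map.keysSet ns <> Map.keysSet rs) ]

main :: IO ()
main = print (zipPipelines
                (Map.fromList [("x86_64-linux", NamedJob "x86_64-linux-validate")])
                (Map.fromList [("x86_64-linux", NamedJob "nightly-x86_64-linux-validate")])
                Map.empty)   -- no release build for this platform => r = Nothing
```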
- - - - -
f9a5563d by Matthew Pickering at 2023-08-14T13:47:11-04:00
gen_ci: Rules rework
In particular we now distinguish between whether we are dealing with a
Nightly/Release pipeline (for which labels don't matter) and a validate
pipeline (where labels do matter).
The overall goal here is to allow a disjunction of labels for validate
pipelines, for example,
> Run a job if we have the full-ci label or test-primops label
Therefore the "ValidateOnly" rules are treated as a set of disjunctions
rather than conjunctions like before.
What this means in particular is that if we want to ONLY run a job when a
certain label is set, for example the "FreeBSD" label, then we have to
override the whole label set.
Fixes #23772
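(Editor's sketch, not part of the commit: a reduced version of the disjunctive label rules described above, mirroring the `labelString`/`or_all` helpers visible in the gen_ci.hs hunk below. The rule set here is truncated to three constructors for brevity, and the real FullCI rule also matches the marge_bot_batch_merge_job label.)
```
import Data.List (intercalate)
import qualified Data.Set as S

data ValidateRule = FullCI | LLVMBackend | FreeBSDLabel
  deriving (Show, Ord, Eq)

labelString :: String -> String
labelString s = "$CI_MERGE_REQUEST_LABELS =~ /.*" ++ s ++ ".*/"

validateRuleString :: ValidateRule -> String
validateRuleString FullCI       = labelString "full-ci"
validateRuleString LLVMBackend  = labelString "LLVM backend"
validateRuleString FreeBSDLabel = labelString "FreeBSD"

orAll :: [String] -> String
orAll = intercalate " || " . map (\s -> "(" ++ s ++ ")")

-- A validate job runs when ANY of its enabled rules matches; an empty
-- rule set is rendered as a constant-true expression.
ruleString :: S.Set ValidateRule -> String
ruleString rs
  | S.null rs = "\"true\" == \"true\""
  | otherwise = orAll (map validateRuleString (S.toList rs))

main :: IO ()
main = putStrLn (ruleString (S.fromList [LLVMBackend, FreeBSDLabel]))
-- ($CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/)
```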
- - - - -
d54b0c1d by Matthew Pickering at 2023-08-14T13:47:11-04:00
ci: set -e for lint-ci-config scripts
- - - - -
994a9b35 by Matthew Pickering at 2023-08-14T13:47:11-04:00
ci: Fix job metadata generation
- - - - -
ffd32d9e by Matthew Pickering at 2023-08-16T13:06:23+01:00
hadrian: Add reloc-binary-dist-* targets
The idea here is that on Windows we want to ship an already "installed"
bindist. So the way to achieve this is to build a normal bindist, and
then install it before zipping that up.
- - - - -
f7bbaaa9 by Matthew Pickering at 2023-08-16T13:06:36+01:00
Windows testing
- - - - -
b38ac7fb by Matthew Pickering at 2023-08-16T13:06:36+01:00
Windows testing
- - - - -
26b70a64 by Matthew Pickering at 2023-08-16T13:06:36+01:00
Move setup earlier
- - - - -
ba1b75ae by Matthew Pickering at 2023-08-16T13:06:36+01:00
another attempt
- - - - -
79e5a75c by Matthew Pickering at 2023-08-16T13:06:36+01:00
fix
- - - - -
ced313f0 by Matthew Pickering at 2023-08-16T13:06:36+01:00
debugging
- - - - -
5b018439 by Matthew Pickering at 2023-08-16T13:06:36+01:00
wip
- - - - -
b9c09e28 by Matthew Pickering at 2023-08-16T13:06:36+01:00
fix paths
- - - - -
f301efc8 by Matthew Pickering at 2023-08-16T13:06:36+01:00
fix paths
- - - - -
2e5b3ba0 by Matthew Pickering at 2023-08-16T13:06:36+01:00
Fix relocatable build
- - - - -
0dfc701c by Matthew Pickering at 2023-08-16T13:06:37+01:00
fixes
- - - - -
65ed306c by Matthew Pickering at 2023-08-16T13:06:37+01:00
fix
- - - - -
c00b87f9 by Matthew Pickering at 2023-08-16T13:06:37+01:00
fix
- - - - -
dbe30e9f by Matthew Pickering at 2023-08-16T13:06:37+01:00
fix
- - - - -
121de723 by Matthew Pickering at 2023-08-16T13:06:37+01:00
fix
- - - - -
f9a22fd3 by Matthew Pickering at 2023-08-16T13:06:37+01:00
fix
- - - - -
29 changed files:
- .gitlab-ci.yml
- .gitlab/ci.sh
- .gitlab/generate-ci/gen_ci.hs
- .gitlab/generate-ci/generate-job-metadata
- .gitlab/generate-ci/generate-jobs
- .gitlab/jobs.yaml
- compiler/GHC/Hs/Utils.hs
- compiler/GHC/Rename/Bind.hs
- compiler/GHC/Rename/Env.hs
- compiler/GHC/Rename/Expr.hs
- compiler/GHC/Rename/Pat.hs
- compiler/GHC/Rename/Utils.hs
- compiler/GHC/Tc/Gen/Expr.hs
- compiler/GHC/Types/Name/Set.hs
- configure.ac
- distrib/configure.ac.in
- hadrian/bindist/Makefile
- hadrian/bindist/config.mk.in
- hadrian/src/Builder.hs
- hadrian/src/Rules/BinaryDist.hs
- hadrian/src/Rules/Documentation.hs
- libraries/base/changelog.md
- + libraries/ghc-prim/ghc-prim.cabal
- m4/fp_settings.m4
- m4/fp_setup_windows_toolchain.m4
- + testsuite/tests/rename/should_compile/RecordWildCardDeprecation.hs
- + testsuite/tests/rename/should_compile/RecordWildCardDeprecation.stderr
- + testsuite/tests/rename/should_compile/RecordWildCardDeprecation_aux.hs
- testsuite/tests/rename/should_compile/all.T
Changes:
=====================================
.gitlab-ci.yml
=====================================
@@ -1013,9 +1013,6 @@ project-version:
artifacts:
paths:
- version.sh
- rules:
- - if: '$NIGHTLY'
- - if: '$RELEASE_JOB == "yes"'
.ghcup-metadata:
stage: deploy
=====================================
.gitlab/ci.sh
=====================================
@@ -490,8 +490,16 @@ function build_hadrian() {
if [[ -n "${REINSTALL_GHC:-}" ]]; then
run_hadrian build-cabal -V
else
- run_hadrian test:all_deps binary-dist -V
- mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ case "$(uname)" in
+ MSYS_*|MINGW*)
+ run_hadrian test:all_deps reloc-binary-dist -V
+ mv _build/reloc-bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ ;;
+ *)
+ run_hadrian test:all_deps binary-dist -V
+ mv _build/bindist/ghc*.tar.xz "$BIN_DIST_NAME.tar.xz"
+ ;;
+ esac
fi
}
@@ -543,7 +551,8 @@ function install_bindist() {
run ${CONFIGURE_WRAPPER:-} ./configure \
--prefix="$instdir" \
- "${args[@]+"${args[@]}"}"
+ "${args[@]+"${args[@]}"}" \
+ || ( cat config.log; fail "configure failed" )
make_install_destdir "$TOP"/destdir "$instdir"
;;
esac
=====================================
.gitlab/generate-ci/gen_ci.hs
=====================================
@@ -3,6 +3,7 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
+{-# LANGUAGE ViewPatterns #-}
import Data.Aeson as A
import qualified Data.Map as Map
@@ -10,10 +11,9 @@ import Data.Map (Map)
import Data.Maybe
import qualified Data.ByteString.Lazy as B
import qualified Data.ByteString.Lazy.Char8 as B
-import Data.List (intercalate)
-import Data.Set (Set)
import qualified Data.Set as S
import System.Environment
+import Data.List
{-
Note [Generating the CI pipeline]
@@ -314,6 +314,7 @@ testEnv arch opsys bc = intercalate "-" $
++ ["int_" ++ bignumString (bignumBackend bc) | bignumBackend bc /= Gmp]
++ ["unreg" | unregisterised bc ]
++ ["numa" | withNuma bc ]
+ ++ ["zstd" | withZstd bc ]
++ ["no_tntc" | not (tablesNextToCode bc) ]
++ ["cross_"++triple | Just triple <- pure $ crossTarget bc ]
++ [flavourString (mkJobFlavour bc)]
@@ -501,21 +502,34 @@ instance ToJSON ArtifactsWhen where
-----------------------------------------------------------------------------
-- Data structure which records the condition when a job is run.
-data OnOffRules = OnOffRules { rule_set :: Set Rule -- ^ The set of enabled rules
+data OnOffRules = OnOffRules { rule_set :: Rule -- ^ The enabled rules
, when :: ManualFlag -- ^ The additional condition about when to run this job.
}
--- The initial set of rules where all rules are disabled and the job is always run.
+-- The initial set of rules, which assumes a Validate pipeline which is run with FullCI.
emptyRules :: OnOffRules
-emptyRules = OnOffRules S.empty OnSuccess
+emptyRules = OnOffRules (ValidateOnly (S.singleton FullCI)) OnSuccess
-- When to run the job
data ManualFlag = Manual -- ^ Only run the job when explicitly triggered by a user
| OnSuccess -- ^ Always run it, if the rules pass (the default)
deriving Eq
-enableRule :: Rule -> OnOffRules -> OnOffRules
-enableRule r (OnOffRules o m) = OnOffRules (S.insert r o) m
+setRule :: Rule -> OnOffRules -> OnOffRules
+setRule r (OnOffRules _ m) = OnOffRules r m
+
+enableValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+enableValidateRule r = modifyValidateRules (S.insert r)
+
+onlyValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+onlyValidateRule r = modifyValidateRules (const (S.singleton r))
+
+removeValidateRule :: ValidateRule -> OnOffRules -> OnOffRules
+removeValidateRule r = modifyValidateRules (S.delete r)
+
+modifyValidateRules :: (S.Set ValidateRule -> S.Set ValidateRule) -> OnOffRules -> OnOffRules
+modifyValidateRules f (OnOffRules (ValidateOnly rs) m) = OnOffRules (ValidateOnly (f rs)) m
+modifyValidateRules _ r = error $ "Applying validate rule to nightly/release job:" ++ show (rule_set r)
manualRule :: OnOffRules -> OnOffRules
manualRule rules = rules { when = Manual }
@@ -524,10 +538,19 @@ manualRule rules = rules { when = Manual }
-- For example, even if you don't explicitly disable a rule it will end up in the
-- rule list with the OFF state.
enumRules :: OnOffRules -> [OnOffRule]
-enumRules o = map lkup rulesList
+enumRules (OnOffRules r _) = rulesList
where
- enabled_rules = rule_set o
- lkup r = OnOffRule (if S.member r enabled_rules then On else Off) r
+ rulesList = case r of
+ ValidateOnly rs -> [OnOffRule On (ValidateOnly rs)
+ , OnOffRule Off ReleaseOnly
+ , OnOffRule Off Nightly ]
+ Nightly -> [ OnOffRule Off (ValidateOnly S.empty)
+ , OnOffRule Off ReleaseOnly
+ , OnOffRule On Nightly ]
+ ReleaseOnly -> [ OnOffRule Off (ValidateOnly S.empty)
+ , OnOffRule On ReleaseOnly
+ , OnOffRule Off Nightly ]
+
data OnOffRule = OnOffRule OnOff Rule
@@ -549,21 +572,29 @@ instance ToJSON OnOffRules where
where
one_rule (OnOffRule onoff r) = ruleString onoff r
- parens s = "(" ++ s ++ ")"
- and_all rs = intercalate " && " (map parens rs)
+
+
+parens :: [Char] -> [Char]
+parens s = "(" ++ s ++ ")"
+and_all :: [[Char]] -> [Char]
+and_all rs = intercalate " && " (map parens rs)
+or_all :: [[Char]] -> [Char]
+or_all rs = intercalate " || " (map parens rs)
-- | A Rule corresponds to some condition which must be satisifed in order to
-- run the job.
-data Rule = FastCI -- ^ Run this job on all validate pipelines, all pipelines are enabled
- -- by the "full-ci" label.
- | ReleaseOnly -- ^ Only run this job in a release pipeline
+data Rule = ReleaseOnly -- ^ Only run this job in a release pipeline
| Nightly -- ^ Only run this job in the nightly pipeline
- | LLVMBackend -- ^ Only run this job when the "LLVM backend" label is present
- | FreeBSDLabel -- ^ Only run this job when the "FreeBSD" label is set.
- | NonmovingGc -- ^ Only run this job when the "non-moving GC" label is set.
- | IpeData -- ^ Only run this job when the "IPE" label is set
- | Disable -- ^ Don't run this job.
- deriving (Bounded, Enum, Ord, Eq)
+ | ValidateOnly (S.Set ValidateRule) -- ^ Only run this job in a validate pipeline, when any of these rules are enabled.
+ deriving (Show, Ord, Eq)
+
+data ValidateRule =
+ FullCI -- ^ Run this job when the "full-ci" label is present.
+ | LLVMBackend -- ^ Run this job when the "LLVM backend" label is present
+ | FreeBSDLabel -- ^ Run this job when the "FreeBSD" label is set.
+ | NonmovingGc -- ^ Run this job when the "non-moving GC" label is set.
+ | IpeData -- ^ Run this job when the "IPE" label is set
+ deriving (Show, Enum, Bounded, Ord, Eq)
-- A constant evaluating to True because gitlab doesn't support "true" in the
-- expression language.
@@ -571,31 +602,30 @@ true :: String
true = "\"true\" == \"true\""
-- A constant evaluating to False because gitlab doesn't support "true" in the
-- expression language.
-false :: String
-false = "\"disabled\" != \"disabled\""
+_false :: String
+_false = "\"disabled\" != \"disabled\""
-- Convert the state of the rule into a string that gitlab understand.
ruleString :: OnOff -> Rule -> String
-ruleString On FastCI = true
-ruleString Off FastCI = "($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)"
-ruleString On LLVMBackend = "$CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/"
-ruleString Off LLVMBackend = true
-ruleString On FreeBSDLabel = "$CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/"
-ruleString Off FreeBSDLabel = true
-ruleString On NonmovingGc = "$CI_MERGE_REQUEST_LABELS =~ /.*non-moving GC.*/"
-ruleString Off NonmovingGc = true
+ruleString On (ValidateOnly vs) =
+ case S.toList vs of
+ [] -> true
+ conds -> or_all (map validateRuleString conds)
+ruleString Off (ValidateOnly {}) = true
ruleString On ReleaseOnly = "$RELEASE_JOB == \"yes\""
ruleString Off ReleaseOnly = "$RELEASE_JOB != \"yes\""
ruleString On Nightly = "$NIGHTLY"
ruleString Off Nightly = "$NIGHTLY == null"
-ruleString On IpeData = "$CI_MERGE_REQUEST_LABELS =~ /.*IPE.*/"
-ruleString Off IpeData = true
-ruleString On Disable = false
-ruleString Off Disable = true
--- Enumeration of all the rules
-rulesList :: [Rule]
-rulesList = [minBound .. maxBound]
+labelString :: String -> String
+labelString s = "$CI_MERGE_REQUEST_LABELS =~ /.*" ++ s ++ ".*/"
+
+validateRuleString :: ValidateRule -> String
+validateRuleString FullCI = or_all ([labelString "full-ci", labelString "marge_bot_batch_merge_job"])
+validateRuleString LLVMBackend = labelString "LLVM backend"
+validateRuleString FreeBSDLabel = labelString "FreeBSD"
+validateRuleString NonmovingGc = labelString "non-moving GC"
+validateRuleString IpeData = labelString "IPE"
-- | A 'Job' is the description of a single job in a gitlab pipeline. The
-- job contains all the information about how to do the build but can be further
@@ -740,16 +770,28 @@ modifyJobs = fmap
-- | Modify just the validate jobs in a 'JobGroup'
modifyValidateJobs :: (a -> a) -> JobGroup a -> JobGroup a
-modifyValidateJobs f jg = jg { v = f <$> v jg }
+modifyValidateJobs f jg = jg { v = fmap f <$> v jg }
-- | Modify just the nightly jobs in a 'JobGroup'
modifyNightlyJobs :: (a -> a) -> JobGroup a -> JobGroup a
-modifyNightlyJobs f jg = jg { n = f <$> n jg }
+modifyNightlyJobs f jg = jg { n = fmap f <$> n jg }
-- Generic helpers
-addJobRule :: Rule -> Job -> Job
-addJobRule r j = j { jobRules = enableRule r (jobRules j) }
+setJobRule :: Rule -> Job -> Job
+setJobRule r j = j { jobRules = setRule r (jobRules j) }
+
+addValidateJobRule :: ValidateRule -> Job -> Job
+addValidateJobRule r = modifyValidateJobRule (enableValidateRule r)
+
+onlyValidateJobRule :: ValidateRule -> Job -> Job
+onlyValidateJobRule r = modifyValidateJobRule (onlyValidateRule r)
+
+removeValidateJobRule :: ValidateRule -> Job -> Job
+removeValidateJobRule r = modifyValidateJobRule (removeValidateRule r)
+
+modifyValidateJobRule :: (OnOffRules -> OnOffRules) -> Job -> Job
+modifyValidateJobRule f j = j { jobRules = f (jobRules j) }
addVariable :: String -> String -> Job -> Job
addVariable k v j = j { jobVariables = mminsertWith (++) k [v] (jobVariables j) }
@@ -769,10 +811,10 @@ validate = job
-- Nightly and release apply the FastCI configuration to all jobs so that they all run in
-- the pipeline (not conditional on the full-ci label)
nightlyRule :: Job -> Job
-nightlyRule = addJobRule FastCI . addJobRule Nightly
+nightlyRule = setJobRule Nightly
releaseRule :: Job -> Job
-releaseRule = addJobRule FastCI . addJobRule ReleaseOnly
+releaseRule = setJobRule ReleaseOnly
-- | Make a normal nightly CI job
nightly :: Arch -> Opsys -> BuildConfig -> NamedJob Job
@@ -815,7 +857,7 @@ useHashUnitIds = addVariable "HADRIAN_ARGS" "--hash-unit-ids"
-- | Mark the validate job to run in fast-ci mode
-- This is default way, to enable all jobs you have to apply the `full-ci` label.
fastCI :: JobGroup Job -> JobGroup Job
-fastCI = modifyValidateJobs (addJobRule FastCI)
+fastCI = modifyValidateJobs (removeValidateJobRule FullCI)
-- | Mark a group of jobs as allowed to fail.
allowFailureGroup :: JobGroup Job -> JobGroup Job
@@ -823,15 +865,19 @@ allowFailureGroup = modifyJobs allowFailure
-- | Add a 'Rule' to just the validate job, for example, only run a job if a certain
-- label is set.
-addValidateRule :: Rule -> JobGroup Job -> JobGroup Job
-addValidateRule t = modifyValidateJobs (addJobRule t)
+addValidateRule :: ValidateRule -> JobGroup Job -> JobGroup Job
+addValidateRule t = modifyValidateJobs (addValidateJobRule t)
+
+-- | Only run a validate job if a certain rule is enabled
+onlyRule :: ValidateRule -> JobGroup Job -> JobGroup Job
+onlyRule t = modifyValidateJobs (onlyValidateJobRule t)
-- | Don't run the validate job, normally used to alleviate CI load by marking
-- jobs which are unlikely to fail (ie different linux distros)
disableValidate :: JobGroup Job -> JobGroup Job
-disableValidate = addValidateRule Disable
+disableValidate st = st { v = Nothing }
-data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving Functor
+data NamedJob a = NamedJob { name :: String, jobInfo :: a } deriving (Show, Functor)
renameJob :: (String -> String) -> NamedJob a -> NamedJob a
renameJob f (NamedJob n i) = NamedJob (f n) i
@@ -841,31 +887,32 @@ instance ToJSON a => ToJSON (NamedJob a) where
[ "name" A..= name nj
, "jobInfo" A..= jobInfo nj ]
+
+--data NamedJobGroup a = NamedJobGroup { platform :: String, jg :: JobGroup a }
+
-- Jobs are grouped into either triples or pairs depending on whether the
-- job is just validate and nightly, or also release.
-data JobGroup a = StandardTriple { v :: NamedJob a
- , n :: NamedJob a
- , r :: NamedJob a }
- | ValidateOnly { v :: NamedJob a
- , n :: NamedJob a } deriving Functor
+data JobGroup a = StandardTriple { v :: Maybe (NamedJob a)
+ , n :: Maybe (NamedJob a)
+ , r :: Maybe (NamedJob a) } deriving (Functor, Show)
instance ToJSON a => ToJSON (JobGroup a) where
- toJSON jg = object
- [ "n" A..= n jg
- , "r" A..= r jg
+ toJSON StandardTriple{..} = object
+ [ "v" A..= v
+ , "n" A..= n
+ , "r" A..= r
]
rename :: (String -> String) -> JobGroup a -> JobGroup a
-rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f nv) (renameJob f nn) (renameJob f nr)
-rename f (ValidateOnly nv nn) = ValidateOnly (renameJob f nv) (renameJob f nn)
+rename f (StandardTriple nv nn nr) = StandardTriple (renameJob f <$> nv) (renameJob f <$> nn) (renameJob f <$> nr)
-- | Construct a 'JobGroup' which consists of a validate, nightly and release build with
-- a specific config.
standardBuildsWithConfig :: Arch -> Opsys -> BuildConfig -> JobGroup Job
standardBuildsWithConfig a op bc =
- StandardTriple (validate a op bc)
- (nightly a op bc)
- (release a op bc)
+ StandardTriple (Just (validate a op bc))
+ (Just (nightly a op bc))
+ (Just (release a op bc))
-- | Construct a 'JobGroup' which consists of a validate, nightly and release builds with
-- the 'vanilla' config.
@@ -875,11 +922,12 @@ standardBuilds a op = standardBuildsWithConfig a op vanilla
-- | Construct a 'JobGroup' which just consists of a validate and nightly build. We don't
-- produce releases for these jobs.
validateBuilds :: Arch -> Opsys -> BuildConfig -> JobGroup Job
-validateBuilds a op bc = ValidateOnly (validate a op bc) (nightly a op bc)
+validateBuilds a op bc = StandardTriple { v = Just (validate a op bc)
+ , n = Just (nightly a op bc)
+ , r = Nothing }
flattenJobGroup :: JobGroup a -> [(String, a)]
-flattenJobGroup (StandardTriple a b c) = map flattenNamedJob [a,b,c]
-flattenJobGroup (ValidateOnly a b) = map flattenNamedJob [a, b]
+flattenJobGroup (StandardTriple a b c) = map flattenNamedJob (catMaybes [a,b,c])
flattenNamedJob :: NamedJob a -> (String, a)
flattenNamedJob (NamedJob n i) = (n, i)
@@ -887,9 +935,7 @@ flattenNamedJob (NamedJob n i) = (n, i)
-- | Specification for all the jobs we want to build.
jobs :: Map String Job
-jobs = Map.fromList $ concatMap (filter is_enabled_job . flattenJobGroup) job_groups
- where
- is_enabled_job (_, Job {jobRules = OnOffRules {..}}) = not $ Disable `S.member` rule_set
+jobs = Map.fromList $ concatMap (flattenJobGroup) job_groups
job_groups :: [JobGroup Job]
job_groups =
@@ -904,7 +950,7 @@ job_groups =
, -- Nightly allowed to fail: #22343
modifyNightlyJobs allowFailure
(modifyValidateJobs manual (validateBuilds Amd64 (Linux Debian10) noTntc))
- , addValidateRule LLVMBackend (validateBuilds Amd64 (Linux Debian10) llvm)
+ , onlyRule LLVMBackend (validateBuilds Amd64 (Linux Debian10) llvm)
, disableValidate (standardBuilds Amd64 (Linux Debian11))
-- We still build Deb9 bindists for now due to Ubuntu 18 and Linux Mint 19
-- not being at EOL until April 2023 and they still need tinfo5.
@@ -922,7 +968,7 @@ job_groups =
, fastCI (standardBuildsWithConfig Amd64 Windows vanilla)
, disableValidate (standardBuildsWithConfig Amd64 Windows nativeInt)
, standardBuilds Amd64 Darwin
- , allowFailureGroup (addValidateRule FreeBSDLabel (validateBuilds Amd64 FreeBSD13 vanilla))
+ , allowFailureGroup (onlyRule FreeBSDLabel (validateBuilds Amd64 FreeBSD13 vanilla))
, fastCI (standardBuilds AArch64 Darwin)
, fastCI (standardBuildsWithConfig AArch64 (Linux Debian10) (splitSectionsBroken vanilla))
, disableValidate (validateBuilds AArch64 (Linux Debian10) llvm)
@@ -942,9 +988,8 @@ job_groups =
make_wasm_jobs wasm_build_config {bignumBackend = Native}
, modifyValidateJobs manual $
make_wasm_jobs wasm_build_config {unregisterised = True}
- , addValidateRule NonmovingGc (standardBuildsWithConfig Amd64 (Linux Debian11) vanilla {validateNonmovingGc = True})
- , modifyNightlyJobs (addJobRule Disable) $
- addValidateRule IpeData (validateBuilds Amd64 (Linux Debian10) zstdIpe)
+ , onlyRule NonmovingGc (standardBuildsWithConfig Amd64 (Linux Debian11) vanilla {validateNonmovingGc = True})
+ , onlyRule IpeData (validateBuilds Amd64 (Linux Debian10) zstdIpe)
]
where
@@ -994,27 +1039,51 @@ mkPlatform arch opsys = archName arch <> "-" <> opsysName opsys
-- * Prefer jobs which have a corresponding release pipeline
-- * Explicitly require tie-breaking for other cases.
platform_mapping :: Map String (JobGroup BindistInfo)
-platform_mapping = Map.map go $
- Map.fromListWith combine [ (uncurry mkPlatform (jobPlatform (jobInfo $ v j)), j) | j <- filter hasReleaseBuild job_groups ]
+platform_mapping = Map.map go combined_result
where
whitelist = [ "x86_64-linux-alpine3_12-validate"
- , "x86_64-linux-deb10-validate"
, "x86_64-linux-deb11-validate"
+ , "x86_64-linux-deb10-validate+debug_info"
, "x86_64-linux-fedora33-release"
+ , "x86_64-linux-deb11-cross_aarch64-linux-gnu-validate"
, "x86_64-windows-validate"
+ , "nightly-x86_64-linux-alpine3_17-wasm-cross_wasm32-wasi-release+fully_static"
+ , "nightly-x86_64-linux-deb11-validate"
+ , "x86_64-linux-alpine3_17-wasm-cross_wasm32-wasi-release+fully_static"
+ , "nightly-aarch64-linux-deb10-validate"
+ , "nightly-x86_64-linux-alpine3_12-validate"
+ , "nightly-x86_64-linux-deb10-validate"
+ , "nightly-x86_64-linux-fedora33-release"
+ , "nightly-x86_64-windows-validate"
+ , "release-x86_64-linux-alpine3_12-release+no_split_sections"
+ , "release-x86_64-linux-deb10-release"
+ , "release-x86_64-linux-deb11-release"
+ , "release-x86_64-linux-fedora33-release"
+ , "release-x86_64-windows-release"
]
+ process sel = Map.fromListWith combine [ (uncurry mkPlatform (jobPlatform (jobInfo $ j)), j) | (sel -> Just j) <- job_groups ]
+
+ vs = process v
+ ns = process n
+ rs = process r
+
+ all_platforms = Map.keysSet vs <> Map.keysSet ns <> Map.keysSet rs
+
+ combined_result = Map.fromList [ (p, StandardTriple { v = Map.lookup p vs
+ , n = Map.lookup p ns
+ , r = Map.lookup p rs })
+ | p <- S.toList all_platforms ]
+
combine a b
- | name (v a) `elem` whitelist = a -- Explicitly selected
- | name (v b) `elem` whitelist = b
- | otherwise = error (show (name (v a)) ++ show (name (v b)))
+ | name a `elem` whitelist = a -- Explicitly selected
+ | name b `elem` whitelist = b
+ | otherwise = error (show (name a) ++ show (name b))
go = fmap (BindistInfo . unwords . fromJust . mmlookup "BIN_DIST_NAME" . jobVariables)
- hasReleaseBuild (StandardTriple{}) = True
- hasReleaseBuild (ValidateOnly{}) = False
-data BindistInfo = BindistInfo { bindistName :: String }
+data BindistInfo = BindistInfo { bindistName :: String } deriving Show
instance ToJSON BindistInfo where
toJSON (BindistInfo n) = object [ "bindistName" A..= n ]
@@ -1029,6 +1098,7 @@ main = do
("metadata":as) -> write_result as platform_mapping
_ -> error "gen_ci.hs <gitlab|metadata> [file.json]"
+write_result :: ToJSON a => [FilePath] -> a -> IO ()
write_result as obj =
(case as of
[] -> B.putStrLn
=====================================
.gitlab/generate-ci/generate-job-metadata
=====================================
@@ -1,5 +1,7 @@
#!/usr/bin/env bash
+set -e
+
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
# Update job metadata for ghcup
=====================================
.gitlab/generate-ci/generate-jobs
=====================================
@@ -1,5 +1,7 @@
#!/usr/bin/env bash
+set -e
+
out_dir="$(git rev-parse --show-toplevel)/.gitlab"
tmp="$(mktemp)"
=====================================
.gitlab/jobs.yaml
=====================================
@@ -37,7 +37,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -102,7 +102,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -163,7 +163,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -224,7 +224,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -290,7 +290,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -352,7 +352,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -414,7 +414,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -476,7 +476,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -545,7 +545,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -609,7 +609,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -674,7 +674,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -739,7 +739,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -804,7 +804,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -868,7 +868,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -932,7 +932,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -996,7 +996,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1059,7 +1059,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1121,7 +1121,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1183,7 +1183,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1246,7 +1246,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1308,7 +1308,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1370,7 +1370,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1432,7 +1432,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1494,7 +1494,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1521,6 +1521,68 @@
"XZ_OPT": "-9"
}
},
+ "nightly-x86_64-linux-deb10-zstd-validate": {
+ "after_script": [
+ ".gitlab/ci.sh save_cache",
+ ".gitlab/ci.sh save_test_output",
+ ".gitlab/ci.sh clean",
+ "cat ci_timings"
+ ],
+ "allow_failure": false,
+ "artifacts": {
+ "expire_in": "8 weeks",
+ "paths": [
+ "ghc-x86_64-linux-deb10-zstd-validate.tar.xz",
+ "junit.xml",
+ "unexpected-test-output.tar.gz"
+ ],
+ "reports": {
+ "junit": "junit.xml"
+ },
+ "when": "always"
+ },
+ "cache": {
+ "key": "x86_64-linux-deb10-$CACHE_REV",
+ "paths": [
+ "cabal-cache",
+ "toolchain"
+ ]
+ },
+ "dependencies": [],
+ "image": "registry.gitlab.haskell.org/ghc/ci-images/x86_64-linux-deb10:$DOCKER_REV",
+ "needs": [
+ {
+ "artifacts": false,
+ "job": "hadrian-ghc-in-ghci"
+ }
+ ],
+ "rules": [
+ {
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
+ "when": "on_success"
+ }
+ ],
+ "script": [
+ "sudo chown ghc:ghc -R .",
+ ".gitlab/ci.sh setup",
+ ".gitlab/ci.sh configure",
+ ".gitlab/ci.sh build_hadrian",
+ ".gitlab/ci.sh test_hadrian"
+ ],
+ "stage": "full-build",
+ "tags": [
+ "x86_64-linux"
+ ],
+ "variables": {
+ "BIGNUM_BACKEND": "gmp",
+ "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-zstd-validate",
+ "BUILD_FLAVOUR": "validate",
+ "CONFIGURE_ARGS": "--enable-ipe-data-compression",
+ "RUNTEST_ARGS": "",
+ "TEST_ENV": "x86_64-linux-deb10-zstd-validate",
+ "XZ_OPT": "-9"
+ }
+ },
"nightly-x86_64-linux-deb11-cross_aarch64-linux-gnu-validate": {
"after_script": [
".gitlab/ci.sh save_cache",
@@ -1558,7 +1620,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1622,7 +1684,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1687,7 +1749,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1749,7 +1811,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1811,7 +1873,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1873,7 +1935,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -1937,7 +1999,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2002,7 +2064,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2066,7 +2128,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2129,7 +2191,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2191,7 +2253,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2249,7 +2311,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2311,7 +2373,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY)",
"when": "on_success"
}
],
@@ -2377,7 +2439,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2444,7 +2506,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2508,7 +2570,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2572,7 +2634,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2642,7 +2704,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2708,7 +2770,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2774,7 +2836,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2840,7 +2902,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2904,7 +2966,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -2968,7 +3030,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3032,7 +3094,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3096,7 +3158,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3160,7 +3222,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3224,7 +3286,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3290,7 +3352,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3356,7 +3418,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3422,7 +3484,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3486,7 +3548,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3550,7 +3612,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3610,7 +3672,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3673,7 +3735,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB == \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3740,7 +3802,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3808,7 +3870,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && ($CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*FreeBSD.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3871,7 +3933,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3935,7 +3997,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -3999,7 +4061,7 @@
"rules": [
{
"allow_failure": true,
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "manual"
}
],
@@ -4063,7 +4125,7 @@
"rules": [
{
"allow_failure": true,
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "manual"
}
],
@@ -4126,7 +4188,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4188,7 +4250,7 @@
"rules": [
{
"allow_failure": true,
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "manual"
}
],
@@ -4249,7 +4311,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4311,7 +4373,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4335,7 +4397,7 @@
"TEST_ENV": "x86_64-linux-deb10-unreg-validate"
}
},
- "x86_64-linux-deb10-validate": {
+ "x86_64-linux-deb10-validate+debug_info": {
"after_script": [
".gitlab/ci.sh save_cache",
".gitlab/ci.sh save_test_output",
@@ -4346,7 +4408,7 @@
"artifacts": {
"expire_in": "2 weeks",
"paths": [
- "ghc-x86_64-linux-deb10-validate.tar.xz",
+ "ghc-x86_64-linux-deb10-validate+debug_info.tar.xz",
"junit.xml",
"unexpected-test-output.tar.gz"
],
@@ -4372,7 +4434,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && ($CI_MERGE_REQUEST_LABELS =~ /.*IPE.*/) && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4389,14 +4451,14 @@
],
"variables": {
"BIGNUM_BACKEND": "gmp",
- "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate",
- "BUILD_FLAVOUR": "validate",
- "CONFIGURE_ARGS": "--enable-ipe-data-compression",
+ "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+debug_info",
+ "BUILD_FLAVOUR": "validate+debug_info",
+ "CONFIGURE_ARGS": "",
"RUNTEST_ARGS": "",
- "TEST_ENV": "x86_64-linux-deb10-validate"
+ "TEST_ENV": "x86_64-linux-deb10-validate+debug_info"
}
},
- "x86_64-linux-deb10-validate+debug_info": {
+ "x86_64-linux-deb10-validate+llvm": {
"after_script": [
".gitlab/ci.sh save_cache",
".gitlab/ci.sh save_test_output",
@@ -4407,7 +4469,7 @@
"artifacts": {
"expire_in": "2 weeks",
"paths": [
- "ghc-x86_64-linux-deb10-validate+debug_info.tar.xz",
+ "ghc-x86_64-linux-deb10-validate+llvm.tar.xz",
"junit.xml",
"unexpected-test-output.tar.gz"
],
@@ -4433,7 +4495,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4450,14 +4512,14 @@
],
"variables": {
"BIGNUM_BACKEND": "gmp",
- "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+debug_info",
- "BUILD_FLAVOUR": "validate+debug_info",
+ "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+llvm",
+ "BUILD_FLAVOUR": "validate+llvm",
"CONFIGURE_ARGS": "",
"RUNTEST_ARGS": "",
- "TEST_ENV": "x86_64-linux-deb10-validate+debug_info"
+ "TEST_ENV": "x86_64-linux-deb10-validate+llvm"
}
},
- "x86_64-linux-deb10-validate+llvm": {
+ "x86_64-linux-deb10-validate+thread_sanitizer": {
"after_script": [
".gitlab/ci.sh save_cache",
".gitlab/ci.sh save_test_output",
@@ -4468,7 +4530,7 @@
"artifacts": {
"expire_in": "2 weeks",
"paths": [
- "ghc-x86_64-linux-deb10-validate+llvm.tar.xz",
+ "ghc-x86_64-linux-deb10-validate+thread_sanitizer.tar.xz",
"junit.xml",
"unexpected-test-output.tar.gz"
],
@@ -4494,8 +4556,9 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && ($CI_MERGE_REQUEST_LABELS =~ /.*LLVM backend.*/) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
- "when": "on_success"
+ "allow_failure": true,
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
+ "when": "manual"
}
],
"script": [
@@ -4511,14 +4574,16 @@
],
"variables": {
"BIGNUM_BACKEND": "gmp",
- "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+llvm",
- "BUILD_FLAVOUR": "validate+llvm",
+ "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+thread_sanitizer",
+ "BUILD_FLAVOUR": "validate+thread_sanitizer",
"CONFIGURE_ARGS": "",
+ "HADRIAN_ARGS": "--docs=none",
"RUNTEST_ARGS": "",
- "TEST_ENV": "x86_64-linux-deb10-validate+llvm"
+ "TEST_ENV": "x86_64-linux-deb10-validate+thread_sanitizer",
+ "TSAN_OPTIONS": "suppressions=$CI_PROJECT_DIR/rts/.tsan-suppressions"
}
},
- "x86_64-linux-deb10-validate+thread_sanitizer": {
+ "x86_64-linux-deb10-zstd-validate": {
"after_script": [
".gitlab/ci.sh save_cache",
".gitlab/ci.sh save_test_output",
@@ -4529,7 +4594,7 @@
"artifacts": {
"expire_in": "2 weeks",
"paths": [
- "ghc-x86_64-linux-deb10-validate+thread_sanitizer.tar.xz",
+ "ghc-x86_64-linux-deb10-zstd-validate.tar.xz",
"junit.xml",
"unexpected-test-output.tar.gz"
],
@@ -4555,9 +4620,8 @@
],
"rules": [
{
- "allow_failure": true,
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
- "when": "manual"
+ "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*IPE.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
+ "when": "on_success"
}
],
"script": [
@@ -4573,13 +4637,11 @@
],
"variables": {
"BIGNUM_BACKEND": "gmp",
- "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-validate+thread_sanitizer",
- "BUILD_FLAVOUR": "validate+thread_sanitizer",
- "CONFIGURE_ARGS": "",
- "HADRIAN_ARGS": "--docs=none",
+ "BIN_DIST_NAME": "ghc-x86_64-linux-deb10-zstd-validate",
+ "BUILD_FLAVOUR": "validate",
+ "CONFIGURE_ARGS": "--enable-ipe-data-compression",
"RUNTEST_ARGS": "",
- "TEST_ENV": "x86_64-linux-deb10-validate+thread_sanitizer",
- "TSAN_OPTIONS": "suppressions=$CI_PROJECT_DIR/rts/.tsan-suppressions"
+ "TEST_ENV": "x86_64-linux-deb10-zstd-validate"
}
},
"x86_64-linux-deb11-cross_aarch64-linux-gnu-validate": {
@@ -4619,7 +4681,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4682,7 +4744,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "((($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/))) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4746,7 +4808,7 @@
],
"rules": [
{
- "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*full-ci.*/) || ($CI_MERGE_REQUEST_LABELS =~ /.*marge_bot_batch_merge_job.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && ($CI_MERGE_REQUEST_LABELS =~ /.*non-moving GC.*/) && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(($CI_MERGE_REQUEST_LABELS =~ /.*non-moving GC.*/)) && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4807,7 +4869,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
@@ -4866,7 +4928,7 @@
],
"rules": [
{
- "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null) && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\") && (\"true\" == \"true\")",
+ "if": "(\"true\" == \"true\") && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)",
"when": "on_success"
}
],
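The rules rework visible above collapses each job's long chain of ("true" == "true") conjuncts into a single disjunction of label checks plus the release/nightly guards. As a rough illustration only (a hypothetical sketch, not the actual generator code; labelMatch and renderValidateIf are made-up names), such an `if:` expression could be rendered like this:

```
import Data.List (intercalate)

-- One regex test per label, as in the generated expressions above.
labelMatch :: String -> String
labelMatch lbl = "($CI_MERGE_REQUEST_LABELS =~ /.*" ++ lbl ++ ".*/)"

-- A validate-only rule: a disjunction of label tests, guarded so it never
-- fires on release or nightly pipelines.
renderValidateIf :: [String] -> String
renderValidateIf labels =
  "(" ++ intercalate " || " (map labelMatch labels) ++ ")"
    ++ " && ($RELEASE_JOB != \"yes\") && ($NIGHTLY == null)"

-- renderValidateIf ["full-ci", "marge_bot_batch_merge_job"] yields, modulo
-- JSON escaping and extra parentheses, the `if:` string attached to the
-- deb10 validate jobs above.
```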
=====================================
compiler/GHC/Hs/Utils.hs
=====================================
@@ -4,7 +4,7 @@
{-|
Module : GHC.Hs.Utils
Description : Generic helpers for the HsSyn type.
-Copyright : (c) The University of Glasgow, 1992-2006
+Copyright : (c) The University of Glasgow, 1992-2023
Here we collect a variety of helper functions that construct or
analyse HsSyn. All these functions deal with generic HsSyn; functions
@@ -35,8 +35,10 @@ just attach noSrcSpan to everything.
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE ViewPatterns #-}
+{-# LANGUAGE NamedFieldPuns #-}
{-# OPTIONS_GHC -Wno-incomplete-record-updates #-}
+{-# LANGUAGE RecordWildCards #-}
module GHC.Hs.Utils(
-- * Terms
@@ -105,7 +107,9 @@ module GHC.Hs.Utils(
hsForeignDeclsBinders, hsGroupBinders, hsDataFamInstBinders,
-- * Collecting implicit binders
- lStmtsImplicits, hsValBindsImplicits, lPatImplicits
+ ImplicitFieldBinders(..),
+ lStmtsImplicits, hsValBindsImplicits, lPatImplicits,
+ lHsRecFieldsImplicits
) where
import GHC.Prelude hiding (head, init, last, tail)
@@ -151,7 +155,6 @@ import GHC.Utils.Outputable
import GHC.Utils.Panic
import Control.Arrow ( first )
-import Data.Either ( partitionEithers )
import Data.Foldable ( toList )
import Data.List ( partition )
import Data.List.NonEmpty ( nonEmpty )
@@ -1677,32 +1680,69 @@ constructor is an *occurrence* not a binding site
* *
************************************************************************
-The job of this family of functions is to run through binding sites and find the set of all Names
-that were defined "implicitly", without being explicitly written by the user.
+The job of the following family of functions is to run through binding sites and find
+the set of all Names that were defined "implicitly", without being explicitly written
+by the user.
-The main purpose is to find names introduced by record wildcards so that we can avoid
-warning the user when they don't use those names (#4404)
+Note [Collecting implicit binders]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+We collect all the RHS Names that are implicitly introduced by record wildcards,
+so that we can:
+
+ - avoid warning the user when they don't use those names (#4404),
+ - report deprecation warnings for deprecated fields that are used (#23382).
+
+The functions that collect implicit binders return a collection of 'ImplicitFieldBinders',
+which associates each implicitly-introduced record field with the bound variables in the
+RHS of the record field pattern, e.g. in
+
+ data R = MkR { fld :: Int }
+ foo (MkR { .. }) = fld
+
+the renamer will elaborate this to
+
+ foo (MkR { fld = fld_var }) = fld_var
+
+and the implicit binders function will return
+
+ [ ImplicitFieldBinders { implFlBndr_field = fld
+ , implFlBndr_binders = [fld_var] } ]
+
+This information is then used:
-Since the addition of -Wunused-record-wildcards, this function returns a pair
-of [(SrcSpan, [Name])]. Each element of the list is one set of implicit
-binders, the first component of the tuple is the document describes the possible
-fix to the problem (by removing the ..).
+ - in the calls to GHC.Rename.Utils.checkUnusedRecordWildcard, to emit
+ a warning when a record wildcard binds no new variables (redundant record wildcard)
+ or none of the bound variables are used (unused record wildcard).
+ - in GHC.Rename.Utils.deprecateUsedRecordWildcard, to emit a warning
+ when the field is deprecated and any of the binders are used.
+
+NOTE: the implFlBndr_binders field should always be a singleton
+ (since the RHS of an implicit binding should always be a VarPat,
+ created in rnHsRecPatsAndThen.mkVarPat)
-This means there is some unfortunate coupling between this function and where it
-is used but it's only used for one specific purpose in one place so it seemed
-easier.
-}
+-- | All binders corresponding to a single implicit record field pattern.
+--
+-- See Note [Collecting implicit binders].
+data ImplicitFieldBinders
+ = ImplicitFieldBinders { implFlBndr_field :: Name
+ -- ^ The 'Name' of the record field
+ , implFlBndr_binders :: [Name]
+ -- ^ The binders of the RHS of the record field pattern
+ -- (in practice, always a singleton: see Note [Collecting implicit binders])
+ }
+
lStmtsImplicits :: [LStmtLR GhcRn (GhcPass idR) (LocatedA (body (GhcPass idR)))]
- -> [(SrcSpan, [Name])]
+ -> [(SrcSpan, [ImplicitFieldBinders])]
lStmtsImplicits = hs_lstmts
where
hs_lstmts :: [LStmtLR GhcRn (GhcPass idR) (LocatedA (body (GhcPass idR)))]
- -> [(SrcSpan, [Name])]
+ -> [(SrcSpan, [ImplicitFieldBinders])]
hs_lstmts = concatMap (hs_stmt . unLoc)
hs_stmt :: StmtLR GhcRn (GhcPass idR) (LocatedA (body (GhcPass idR)))
- -> [(SrcSpan, [Name])]
+ -> [(SrcSpan, [ImplicitFieldBinders])]
hs_stmt (BindStmt _ pat _) = lPatImplicits pat
hs_stmt (ApplicativeStmt _ args _) = concatMap do_arg args
where do_arg (_, ApplicativeArgOne { app_arg_pattern = pat }) = lPatImplicits pat
@@ -1719,19 +1759,26 @@ lStmtsImplicits = hs_lstmts
hs_local_binds (HsIPBinds {}) = []
hs_local_binds (EmptyLocalBinds _) = []
-hsValBindsImplicits :: HsValBindsLR GhcRn (GhcPass idR) -> [(SrcSpan, [Name])]
+hsValBindsImplicits :: HsValBindsLR GhcRn (GhcPass idR)
+ -> [(SrcSpan, [ImplicitFieldBinders])]
hsValBindsImplicits (XValBindsLR (NValBinds binds _))
= concatMap (lhsBindsImplicits . snd) binds
hsValBindsImplicits (ValBinds _ binds _)
= lhsBindsImplicits binds
-lhsBindsImplicits :: LHsBindsLR GhcRn idR -> [(SrcSpan, [Name])]
+lhsBindsImplicits :: LHsBindsLR GhcRn idR -> [(SrcSpan, [ImplicitFieldBinders])]
lhsBindsImplicits = foldBag (++) (lhs_bind . unLoc) []
where
lhs_bind (PatBind { pat_lhs = lpat }) = lPatImplicits lpat
lhs_bind _ = []
-lPatImplicits :: LPat GhcRn -> [(SrcSpan, [Name])]
+-- | Collect all record wild card binders in the given pattern.
+--
+-- These are all the variables bound in all (possibly nested) record wildcard patterns
+-- appearing inside the pattern.
+--
+-- See Note [Collecting implicit binders].
+lPatImplicits :: LPat GhcRn -> [(SrcSpan, [ImplicitFieldBinders])]
lPatImplicits = hs_lpat
where
hs_lpat lpat = hs_pat (unLoc lpat)
@@ -1745,28 +1792,41 @@ lPatImplicits = hs_lpat
hs_pat (ParPat _ _ pat _) = hs_lpat pat
hs_pat (ListPat _ pats) = hs_lpats pats
hs_pat (TuplePat _ pats _) = hs_lpats pats
-
hs_pat (SigPat _ pat _) = hs_lpat pat
- hs_pat (ConPat {pat_con=con, pat_args=ps}) = details con ps
+ hs_pat (ConPat {pat_args=ps}) = details ps
hs_pat _ = []
- details :: LocatedN Name -> HsConPatDetails GhcRn -> [(SrcSpan, [Name])]
- details _ (PrefixCon _ ps) = hs_lpats ps
- details n (RecCon fs) =
- [(err_loc, collectPatsBinders CollNoDictBinders implicit_pats) | Just{} <- [rec_dotdot fs] ]
- ++ hs_lpats explicit_pats
-
- where implicit_pats = map (hfbRHS . unLoc) implicit
- explicit_pats = map (hfbRHS . unLoc) explicit
-
+ details :: HsConPatDetails GhcRn -> [(SrcSpan, [ImplicitFieldBinders])]
+ details (PrefixCon _ ps) = hs_lpats ps
+ details (RecCon (HsRecFields { rec_dotdot = Nothing, rec_flds }))
+ = hs_lpats $ map (hfbRHS . unLoc) rec_flds
+ details (RecCon (HsRecFields { rec_dotdot = Just (L err_loc rec_dotdot), rec_flds }))
+ = [(err_loc, implicit_field_binders)]
+ ++ hs_lpats explicit_pats
+
+ where (explicit_pats, implicit_field_binders)
+ = rec_field_expl_impl rec_flds rec_dotdot
+
+ details (InfixCon p1 p2) = hs_lpat p1 ++ hs_lpat p2
+
+lHsRecFieldsImplicits :: [LHsRecField GhcRn (LPat GhcRn)]
+ -> RecFieldsDotDot
+ -> [ImplicitFieldBinders]
+lHsRecFieldsImplicits rec_flds rec_dotdot
+ = snd $ rec_field_expl_impl rec_flds rec_dotdot
+
+rec_field_expl_impl :: [LHsRecField GhcRn (LPat GhcRn)]
+ -> RecFieldsDotDot
+ -> ([LPat GhcRn], [ImplicitFieldBinders])
+rec_field_expl_impl rec_flds (RecFieldsDotDot { .. })
+ = ( map (hfbRHS . unLoc) explicit_binds
+ , map implicit_field_binders implicit_binds )
+ where (explicit_binds, implicit_binds) = splitAt unRecFieldsDotDot rec_flds
+ implicit_field_binders (L _ (HsFieldBind { hfbLHS = L _ fld, hfbRHS = rhs }))
+ = ImplicitFieldBinders
+ { implFlBndr_field = foExt fld
+ , implFlBndr_binders = collectPatBinders CollNoDictBinders rhs }
- (explicit, implicit) = partitionEithers [if pat_explicit then Left fld else Right fld
- | (i, fld) <- [0..] `zip` rec_flds fs
- , let pat_explicit =
- maybe True ((i<) . unRecFieldsDotDot . unLoc)
- (rec_dotdot fs)]
- err_loc = maybe (getLocA n) getLoc (rec_dotdot fs)
- details _ (InfixCon p1 p2) = hs_lpat p1 ++ hs_lpat p2
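To make the new representation concrete: rec_field_expl_impl splits the fields of a record pattern at the index stored behind the `..`, keeping the prefix as the explicitly written patterns and turning the suffix into ImplicitFieldBinders. A simplified, self-contained sketch of that split, with Strings standing in for the real HsRecField/Name types (every name below is illustrative only, not GHC API):

```
-- Strings stand in for the real field Names and pattern binders.
data FieldBindersSketch = FieldBindersSketch
  { fieldName  :: String    -- cf. implFlBndr_field
  , rhsBinders :: [String]  -- cf. implFlBndr_binders (a singleton in practice)
  } deriving Show

-- The first n fields (n = the index recorded for the "..") were written by
-- the user; everything after them was filled in by the renamer for the "..".
splitExplImpl :: Int -> [(String, [String])]
              -> ([(String, [String])], [FieldBindersSketch])
splitExplImpl dotdot flds = (explicit, map toSketch implicit)
  where
    (explicit, implicit) = splitAt dotdot flds
    toSketch (f, bs) = FieldBindersSketch { fieldName = f, rhsBinders = bs }

-- splitExplImpl 1 [("x", ["x_pat"]), ("fld", ["fld_var"])]
--   == ( [("x", ["x_pat"])]
--      , [FieldBindersSketch { fieldName = "fld", rhsBinders = ["fld_var"] }] )
```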
=====================================
compiler/GHC/Rename/Bind.hs
=====================================
@@ -397,7 +397,7 @@ rnLocalValBindsAndThen binds@(ValBinds _ _ sigs) thing_inside
-- Insert fake uses for variables introduced implicitly by
-- wildcards (#4404)
rec_uses = hsValBindsImplicits binds'
- implicit_uses = mkNameSet $ concatMap snd
+ implicit_uses = mkNameSet $ concatMap (concatMap implFlBndr_binders . snd)
$ rec_uses
; mapM_ (\(loc, ns) ->
checkUnusedRecordWildcard loc real_uses (Just ns))
=====================================
compiler/GHC/Rename/Env.hs
=====================================
@@ -83,7 +83,6 @@ import GHC.Types.Hint
import GHC.Types.Error
import GHC.Unit.Module
import GHC.Unit.Module.ModIface
-import GHC.Unit.Module.Warnings ( WarningTxt(..) )
import GHC.Core.ConLike
import GHC.Core.DataCon
import GHC.Core.TyCon
@@ -114,7 +113,6 @@ import Control.Monad
import Data.Either ( partitionEithers )
import Data.Function ( on )
import Data.List ( find, partition, groupBy, sortBy )
-import Data.Foldable ( for_ )
import qualified Data.List.NonEmpty as NE
import qualified Data.Semigroup as Semi
import System.IO.Unsafe ( unsafePerformIO )
@@ -182,7 +180,7 @@ deprecation warnings during renaming. At the moment, you don't get any
warning until you use the identifier further downstream. This would
require adjusting addUsedGRE so that during signature compilation,
we do not report deprecation warnings for LocalDef. See also
-Note [Handling of deprecations]
+Note [Handling of deprecations] in GHC.Rename.Utils
-}
newTopSrcBinder :: LocatedN RdrName -> RnM Name
@@ -1698,25 +1696,18 @@ lookupGreAvailRn rdr_name
* *
*********************************************************
-Note [Handling of deprecations]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-* We report deprecations at each *occurrence* of the deprecated thing
- (see #5867 and #4879)
-
-* We do not report deprecations for locally-defined names. For a
- start, we may be exporting a deprecated thing. Also we may use a
- deprecated thing in the defn of another deprecated things. We may
- even use a deprecated thing in the defn of a non-deprecated thing,
- when changing a module's interface.
-
-* We also report deprecations at export sites, but only for names
- deprecated with export deprecations (since those are not transitive as opposed
- to regular name deprecations and are only reported at the importing module)
-
-* addUsedGREs: we do not report deprecations for sub-binders:
- - the ".." completion for records
- - the ".." in an export item 'T(..)'
- - the things exported by a module export 'module M'
+Note [Using isImportedGRE in addUsedGRE]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+In addUsedGRE, we want to add any used imported GREs to the tcg_used_gres field,
+so that we can emit appropriate warnings (see GHC.Rename.Names.warnUnusedImportDecls).
+
+We want to do this for GREs that were brought into scope through imports. As per
+Note [GlobalRdrElt provenance] in GHC.Types.Name.Reader, this means we should
+check that gre_imp is non-empty. Checking that gre_lcl is False is INCORRECT,
+because we might have obtained the GRE by an Exact or Orig direct reference,
+in which case we have both gre_lcl = False and gre_imp = emptyBag.
+
+Getting this wrong can lead to panics in e.g. bestImport, see #23240.
-}
addUsedDataCons :: GlobalRdrEnv -> TyCon -> RnM ()
@@ -1727,21 +1718,11 @@ addUsedDataCons rdr_env tycon
| dc <- tyConDataCons tycon
, Just gre <- [lookupGRE_Name rdr_env (dataConName dc)] ]
--- | Whether to report deprecation warnings when registering a used GRE
---
--- There is no option to only emit declaration warnings since everywhere
--- we emit the declaration warnings we also emit export warnings
--- (See Note [Handling of deprecations] for details)
-data DeprecationWarnings
- = NoDeprecationWarnings
- | ExportDeprecationWarnings
- | AllDeprecationWarnings
-
addUsedGRE :: DeprecationWarnings -> GlobalRdrElt -> RnM ()
-- Called for both local and imported things
-- Add usage *and* warn if deprecated
addUsedGRE warn_if_deprec gre
- = do { condWarnIfDeprecated warn_if_deprec [gre]
+ = do { warnIfDeprecated warn_if_deprec [gre]
; when (isImportedGRE gre) $ -- See Note [Using isImportedGRE in addUsedGRE]
do { env <- getGblEnv
-- Do not report the GREInfo (#23424)
@@ -1751,9 +1732,9 @@ addUsedGRE warn_if_deprec gre
addUsedGREs :: DeprecationWarnings -> [GlobalRdrElt] -> RnM ()
-- Record uses of any *imported* GREs
-- Used for recording used sub-bndrs
--- NB: no call to warnIfDeprecated; see Note [Handling of deprecations]
+-- NB: no call to warnIfDeprecated; see Note [Handling of deprecations] in GHC.Rename.Utils
addUsedGREs warn_if_deprec gres
- = do { condWarnIfDeprecated warn_if_deprec gres
+ = do { warnIfDeprecated warn_if_deprec gres
; unless (null imp_gres) $
do { env <- getGblEnv
-- Do not report the GREInfo (#23424)
@@ -1763,85 +1744,6 @@ addUsedGREs warn_if_deprec gres
imp_gres = filter isImportedGRE gres
-- See Note [Using isImportedGRE in addUsedGRE]
-{- Note [Using isImportedGRE in addUsedGRE]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-In addUsedGRE, we want to add any used imported GREs to the tcg_used_gres field,
-so that we can emit appropriate warnings (see GHC.Rename.Names.warnUnusedImportDecls).
-
-We want to do this for GREs that were brought into scope through imports. As per
-Note [GlobalRdrElt provenance] in GHC.Types.Name.Reader, this means we should
-check that gre_imp is non-empty. Checking that gre_lcl is False is INCORRECT,
-because we might have obtained the GRE by an Exact or Orig direct reference,
-in which case we have both gre_lcl = False and gre_imp = emptyBag.
-
-Geting this wrong can lead to panics in e.g. bestImport, see #23240.
--}
-
-condWarnIfDeprecated :: DeprecationWarnings -> [GlobalRdrElt] -> RnM ()
-condWarnIfDeprecated NoDeprecationWarnings _ = return ()
-condWarnIfDeprecated opt gres = do
- this_mod <- getModule
- let external_gres
- = filterOut (nameIsLocalOrFrom this_mod . greName) gres
- mapM_ (\gre -> warnIfExportDeprecated gre >> maybeWarnDeclDepr gre) external_gres
- where
- maybeWarnDeclDepr = case opt of
- ExportDeprecationWarnings -> const $ return ()
- AllDeprecationWarnings -> warnIfDeclDeprecated
-
-warnIfDeclDeprecated :: GlobalRdrElt -> RnM ()
-warnIfDeclDeprecated gre@(GRE { gre_imp = iss })
- | Just imp_spec <- headMaybe iss
- = do { dflags <- getDynFlags
- ; when (wopt_any_custom dflags) $
- -- See Note [Handling of deprecations]
- do { iface <- loadInterfaceForName doc name
- ; case lookupImpDeclDeprec iface gre of
- Just deprText -> addDiagnostic $
- TcRnPragmaWarning
- PragmaWarningName
- { pwarn_occname = occ
- , pwarn_impmod = importSpecModule imp_spec
- , pwarn_declmod = definedMod }
- deprText
- Nothing -> return () } }
- | otherwise
- = return ()
- where
- occ = greOccName gre
- name = greName gre
- definedMod = moduleName $ assertPpr (isExternalName name) (ppr name) (nameModule name)
- doc = text "The name" <+> quotes (ppr occ) <+> text "is mentioned explicitly"
-
-lookupImpDeclDeprec :: ModIface -> GlobalRdrElt -> Maybe (WarningTxt GhcRn)
-lookupImpDeclDeprec iface gre
- -- Bleat if the thing, or its parent, is warn'd
- = mi_decl_warn_fn (mi_final_exts iface) (greOccName gre) `mplus`
- case greParent gre of
- ParentIs p -> mi_decl_warn_fn (mi_final_exts iface) (nameOccName p)
- NoParent -> Nothing
-
-warnIfExportDeprecated :: GlobalRdrElt -> RnM ()
-warnIfExportDeprecated gre@(GRE { gre_imp = iss })
- = do { mod_warn_mbs <- mapBagM process_import_spec iss
- ; for_ (sequence mod_warn_mbs) $ mapM
- $ \(importing_mod, warn_txt) -> addDiagnostic $
- TcRnPragmaWarning
- PragmaWarningExport
- { pwarn_occname = occ
- , pwarn_impmod = importing_mod }
- warn_txt }
- where
- occ = greOccName gre
- name = greName gre
- doc = text "The name" <+> quotes (ppr occ) <+> text "is mentioned explicitly"
- process_import_spec :: ImportSpec -> RnM (Maybe (ModuleName, WarningTxt GhcRn))
- process_import_spec is = do
- let mod = is_mod $ is_decl is
- iface <- loadInterfaceForModule doc mod
- let mb_warn_txt = mi_export_warn_fn (mi_final_exts iface) name
- return $ (moduleName mod, ) <$> mb_warn_txt
-
{-
Note [Used names with interface not loaded]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
=====================================
compiler/GHC/Rename/Expr.hs
=====================================
@@ -1452,7 +1452,7 @@ rnRecStmtsAndThen ctxt rnBody s cont
; let bound_names = collectLStmtsBinders CollNoDictBinders (map fst new_lhs_and_fv)
-- Fake uses of variables introduced implicitly (warning suppression, see #4404)
rec_uses = lStmtsImplicits (map fst new_lhs_and_fv)
- implicit_uses = mkNameSet $ concatMap snd $ rec_uses
+ implicit_uses = mkNameSet $ concatMap (concatMap implFlBndr_binders . snd) $ rec_uses
; bindLocalNamesFV bound_names $
addLocalFixities fix_env bound_names $ do
=====================================
compiler/GHC/Rename/Pat.hs
=====================================
@@ -682,12 +682,15 @@ rnConPatAndThen mk con (RecCon rpats)
}
}
-checkUnusedRecordWildcardCps :: SrcSpan -> Maybe [Name] -> CpsRn ()
+checkUnusedRecordWildcardCps :: SrcSpan
+ -> Maybe [ImplicitFieldBinders]
+ -> CpsRn ()
checkUnusedRecordWildcardCps loc dotdot_names =
CpsRn (\thing -> do
(r, fvs) <- thing ()
checkUnusedRecordWildcard loc fvs dotdot_names
return (r, fvs) )
+
--------------------
rnHsRecPatsAndThen :: NameMaker
-> LocatedN Name -- Constructor
@@ -698,7 +701,7 @@ rnHsRecPatsAndThen mk (L _ con)
= do { flds <- liftCpsFV $ rnHsRecFields (HsRecFieldPat con) mkVarPat
hs_rec_fields
; flds' <- mapM rn_field (flds `zip` [1..])
- ; check_unused_wildcard (implicit_binders flds' <$> dd)
+ ; check_unused_wildcard (lHsRecFieldsImplicits flds' <$> unLoc <$> dd)
; return (HsRecFields { rec_flds = flds', rec_dotdot = dd }) }
where
mkVarPat l n = VarPat noExtField (L (noAnnSrcSpan l) n)
@@ -708,11 +711,6 @@ rnHsRecPatsAndThen mk (L _ con)
loc = maybe noSrcSpan getLoc dd
- -- Get the arguments of the implicit binders
- implicit_binders fs (unLoc -> RecFieldsDotDot n) = collectPatsBinders CollNoDictBinders implicit_pats
- where
- implicit_pats = map (hfbRHS . unLoc) (drop n fs)
-
-- Don't warn for let P{..} = ... in ...
check_unused_wildcard = case mk of
LetMk{} -> const (return ())
=====================================
compiler/GHC/Rename/Utils.hs
=====================================
@@ -1,6 +1,8 @@
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE GADTs #-}
+{-# LANGUAGE RecordWildCards #-}
+{-# LANGUAGE TupleSections #-}
{-
@@ -16,6 +18,7 @@ module GHC.Rename.Utils (
warnUnusedMatches, warnUnusedTypePatterns,
warnUnusedTopBinds, warnUnusedLocalBinds,
warnForallIdentifier,
+ DeprecationWarnings(..), warnIfDeprecated,
checkUnusedRecordWildcard,
badQualBndrErr, typeAppErr, badFieldConErr,
wrapGenSpan, genHsVar, genLHsVar, genHsApp, genHsApps, genLHsApp,
@@ -56,17 +59,25 @@ import GHC.Types.SourceFile
import GHC.Types.SourceText ( SourceText(..), IntegralLit )
import GHC.Utils.Outputable
import GHC.Utils.Misc
+import GHC.Unit.Module.ModIface
+import GHC.Utils.Panic
import GHC.Types.Basic
import GHC.Data.List.SetOps ( removeDupsOn )
import GHC.Data.Maybe ( whenIsJust )
import GHC.Driver.DynFlags
import GHC.Data.FastString
+import GHC.Data.Bag ( mapBagM, headMaybe )
import Control.Monad
import GHC.Settings.Constants ( mAX_TUPLE_SIZE, mAX_CTUPLE_SIZE )
+import GHC.Unit.Module
+import GHC.Unit.Module.Warnings ( WarningTxt(..) )
+import GHC.Iface.Load
import qualified GHC.LanguageExtensions as LangExt
import qualified Data.List as List
import qualified Data.List.NonEmpty as NE
+import Data.Foldable
+import Data.Maybe
{-
@@ -375,14 +386,17 @@ warnUnusedTopBinds gres
-- -Wredundant-record-wildcards
checkUnusedRecordWildcard :: SrcSpan
-> FreeVars
- -> Maybe [Name]
+ -> Maybe [ImplicitFieldBinders]
-> RnM ()
checkUnusedRecordWildcard _ _ Nothing = return ()
-checkUnusedRecordWildcard loc _ (Just []) =
- -- Add a new warning if the .. pattern binds no variables
- setSrcSpan loc $ warnRedundantRecordWildcard
-checkUnusedRecordWildcard loc fvs (Just dotdot_names) =
- setSrcSpan loc $ warnUnusedRecordWildcard dotdot_names fvs
+checkUnusedRecordWildcard loc fvs (Just dotdot_fields_binders)
+ = setSrcSpan loc $ case concatMap implFlBndr_binders dotdot_fields_binders of
+ -- Add a new warning if the .. pattern binds no variables
+ [] -> warnRedundantRecordWildcard
+ dotdot_names
+ -> do
+ warnUnusedRecordWildcard dotdot_names fvs
+ deprecateUsedRecordWildcard dotdot_fields_binders fvs
-- | Produce a warning when the `..` pattern binds no new
@@ -415,6 +429,33 @@ warnUnusedRecordWildcard ns used_names = do
traceRn "warnUnused" (ppr ns $$ ppr used_names $$ ppr used)
warnIf (null used) (TcRnUnusedRecordWildcard ns)
+-- | Emit a deprecation message whenever one of the implicit record wild
+-- card field binders was used in FreeVars.
+--
+-- @
+-- module A where
+-- data P = P { x :: Int, y :: Int }
+-- {-# DEPRECATED x, y "depr msg" #-}
+--
+-- module B where
+-- import A
+-- foo (P{..}) = x
+-- @
+--
+-- Even though both `x` and `y` have deprecations, only `x`
+-- will be deprecated since only its implicit variable is used in the RHS.
+deprecateUsedRecordWildcard :: [ImplicitFieldBinders]
+ -> FreeVars -> RnM ()
+deprecateUsedRecordWildcard dotdot_fields_binders fvs
+ = mapM_ depr_field_binders dotdot_fields_binders
+ where
+ depr_field_binders (ImplicitFieldBinders {..})
+ = when (mkFVs implFlBndr_binders `intersectsFVs` fvs) $ do
+ env <- getGlobalRdrEnv
+ let gre = fromJust $ lookupGRE_Name env implFlBndr_field
+ -- Must be in the env since it was instantiated
+ -- in the implicit binders
+ warnIfDeprecated AllDeprecationWarnings [gre]
warnUnusedLocalBinds, warnUnusedMatches, warnUnusedTypePatterns
@@ -434,6 +475,109 @@ warnForallIdentifier (L l rdr_name@(Unqual occ))
where isKw = (occNameFS occ ==)
warnForallIdentifier _ = return ()
+{-
+************************************************************************
+* *
+\subsection{Custom deprecations utility functions}
+* *
+************************************************************************
+
+Note [Handling of deprecations]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+* We report deprecations at each *occurrence* of the deprecated thing
+ (see #5867 and #4879)
+
+* We do not report deprecations for locally-defined names. For a
+ start, we may be exporting a deprecated thing. Also we may use a
+ deprecated thing in the defn of another deprecated thing. We may
+ even use a deprecated thing in the defn of a non-deprecated thing,
+ when changing a module's interface.
+
+* We also report deprecations at export sites, but only for names
+ deprecated with export deprecations (since those are not transitive as opposed
+ to regular name deprecations and are only reported at the importing module)
+
+* addUsedGREs: we do not report deprecations for sub-binders:
+ - the ".." completion for records
+ - the ".." in an export item 'T(..)'
+ - the things exported by a module export 'module M'
+-}
+
+-- | Whether to report deprecation warnings when registering a used GRE
+--
+-- There is no option to only emit declaration warnings since everywhere
+-- we emit the declaration warnings we also emit export warnings
+-- (See Note [Handling of deprecations] for details)
+data DeprecationWarnings
+ = NoDeprecationWarnings
+ | ExportDeprecationWarnings
+ | AllDeprecationWarnings
+
+warnIfDeprecated :: DeprecationWarnings -> [GlobalRdrElt] -> RnM ()
+warnIfDeprecated NoDeprecationWarnings _ = return ()
+warnIfDeprecated opt gres = do
+ this_mod <- getModule
+ let external_gres
+ = filterOut (nameIsLocalOrFrom this_mod . greName) gres
+ mapM_ (\gre -> warnIfExportDeprecated gre >> maybeWarnDeclDepr gre) external_gres
+ where
+ maybeWarnDeclDepr = case opt of
+ ExportDeprecationWarnings -> const $ return ()
+ AllDeprecationWarnings -> warnIfDeclDeprecated
+
+warnIfDeclDeprecated :: GlobalRdrElt -> RnM ()
+warnIfDeclDeprecated gre@(GRE { gre_imp = iss })
+ | Just imp_spec <- headMaybe iss
+ = do { dflags <- getDynFlags
+ ; when (wopt_any_custom dflags) $
+ -- See Note [Handling of deprecations]
+ do { iface <- loadInterfaceForName doc name
+ ; case lookupImpDeclDeprec iface gre of
+ Just deprText -> addDiagnostic $
+ TcRnPragmaWarning
+ PragmaWarningName
+ { pwarn_occname = occ
+ , pwarn_impmod = importSpecModule imp_spec
+ , pwarn_declmod = definedMod }
+ deprText
+ Nothing -> return () } }
+ | otherwise
+ = return ()
+ where
+ occ = greOccName gre
+ name = greName gre
+ definedMod = moduleName $ assertPpr (isExternalName name) (ppr name) (nameModule name)
+ doc = text "The name" <+> quotes (ppr occ) <+> text "is mentioned explicitly"
+
+lookupImpDeclDeprec :: ModIface -> GlobalRdrElt -> Maybe (WarningTxt GhcRn)
+lookupImpDeclDeprec iface gre
+ -- Bleat if the thing, or its parent, is warn'd
+ = mi_decl_warn_fn (mi_final_exts iface) (greOccName gre) `mplus`
+ case greParent gre of
+ ParentIs p -> mi_decl_warn_fn (mi_final_exts iface) (nameOccName p)
+ NoParent -> Nothing
+
+warnIfExportDeprecated :: GlobalRdrElt -> RnM ()
+warnIfExportDeprecated gre@(GRE { gre_imp = iss })
+ = do { mod_warn_mbs <- mapBagM process_import_spec iss
+ ; for_ (sequence mod_warn_mbs) $ mapM
+ $ \(importing_mod, warn_txt) -> addDiagnostic $
+ TcRnPragmaWarning
+ PragmaWarningExport
+ { pwarn_occname = occ
+ , pwarn_impmod = importing_mod }
+ warn_txt }
+ where
+ occ = greOccName gre
+ name = greName gre
+ doc = text "The name" <+> quotes (ppr occ) <+> text "is mentioned explicitly"
+ process_import_spec :: ImportSpec -> RnM (Maybe (ModuleName, WarningTxt GhcRn))
+ process_import_spec is = do
+ let mod = is_mod $ is_decl is
+ iface <- loadInterfaceForModule doc mod
+ let mb_warn_txt = mi_export_warn_fn (mi_final_exts iface) name
+ return $ (moduleName mod, ) <$> mb_warn_txt
+
-------------------------
-- Helpers
warnUnusedGREs :: [GlobalRdrElt] -> RnM ()
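The net effect of checkUnusedRecordWildcard together with deprecateUsedRecordWildcard is: warn only for wildcard-bound fields whose binders actually occur free in the right-hand side, and only if the field carries a WARNING/DEPRECATED pragma. A minimal sketch of that check, assuming plain Map/Set stand-ins for the GlobalRdrEnv and FreeVars (nothing below is GHC API):

```
import qualified Data.Map as Map
import qualified Data.Set as Set

type Field  = String
type Binder = String

-- Which wildcard-bound fields should trigger a deprecation warning?
usedDeprecatedFields
  :: Map.Map Field String     -- fields carrying a DEPRECATED/WARNING message
  -> [(Field, [Binder])]      -- implicit binders per wildcard-bound field
  -> Set.Set Binder           -- free variables of the right-hand side
  -> [(Field, String)]
usedDeprecatedFields deprs implicits fvs =
  [ (fld, msg)
  | (fld, binders) <- implicits
  , any (`Set.member` fvs) binders       -- a binder of this field is used
  , Just msg <- [Map.lookup fld deprs]   -- and the field is deprecated
  ]

-- With deprs = {"x": "Don't use x", "y": "Don't use y"},
-- implicits = [("x", ["x_var"]), ("y", ["y_var"])] and fvs = {"x_var"},
-- only ("x", "Don't use x") is reported, matching the example in the
-- Haddock comment above.
```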
=====================================
compiler/GHC/Tc/Gen/Expr.hs
=====================================
@@ -54,7 +54,7 @@ import GHC.Tc.Gen.Bind ( tcLocalBinds )
import GHC.Tc.Instance.Family ( tcGetFamInstEnvs )
import GHC.Core.FamInstEnv ( FamInstEnvs )
import GHC.Rename.Expr ( mkExpandedExpr )
-import GHC.Rename.Env ( addUsedGRE, getUpdFieldLbls, DeprecationWarnings(..) )
+import GHC.Rename.Env ( addUsedGRE, getUpdFieldLbls )
import GHC.Tc.Utils.Env
import GHC.Tc.Gen.Arrow
import GHC.Tc.Gen.Match
=====================================
compiler/GHC/Types/Name/Set.hs
=====================================
@@ -22,7 +22,7 @@ module GHC.Types.Name.Set (
-- ** Manipulating sets of free variables
isEmptyFVs, emptyFVs, plusFVs, plusFV,
mkFVs, addOneFV, unitFV, delFV, delFVs,
- intersectFVs,
+ intersectFVs, intersectsFVs,
-- * Defs and uses
Defs, Uses, DefUse, DefUses,
@@ -127,6 +127,7 @@ mkFVs :: [Name] -> FreeVars
delFV :: Name -> FreeVars -> FreeVars
delFVs :: [Name] -> FreeVars -> FreeVars
intersectFVs :: FreeVars -> FreeVars -> FreeVars
+intersectsFVs :: FreeVars -> FreeVars -> Bool
isEmptyFVs :: NameSet -> Bool
isEmptyFVs = isEmptyNameSet
@@ -139,6 +140,7 @@ unitFV = unitNameSet
delFV n s = delFromNameSet s n
delFVs ns s = delListFromNameSet s ns
intersectFVs = intersectNameSet
+intersectsFVs = intersectsNameSet
{-
************************************************************************
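intersectsFVs is the Boolean counterpart of intersectFVs that deprecateUsedRecordWildcard relies on. A minimal sketch of the intended semantics, written against the exports shown in this diff (sharesAnyBinder is a hypothetical name):

```
import GHC.Types.Name.Set (FreeVars, intersectFVs, isEmptyFVs)

-- Expected to agree with intersectsFVs, which simply avoids building the
-- intermediate intersection set.
sharesAnyBinder :: FreeVars -> FreeVars -> Bool
sharesAnyBinder binders fvs = not (isEmptyFVs (binders `intersectFVs` fvs))
```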
=====================================
configure.ac
=====================================
@@ -325,7 +325,8 @@ FP_FIND_ROOT
# Extract and configure the Windows toolchain
if test "$HostOS" = "mingw32" -a "$EnableDistroToolchain" = "NO"; then
- FP_SETUP_WINDOWS_TOOLCHAIN
+ FP_INSTALL_WINDOWS_TOOLCHAIN
+ FP_SETUP_WINDOWS_TOOLCHAIN([$hardtop/inplace/mingw], [$hardtop/inplace/mingw])
else
AC_PATH_TOOL([CC],[gcc], [clang])
AC_PATH_TOOL([CXX],[g++], [clang++])
=====================================
distrib/configure.ac.in
=====================================
@@ -103,6 +103,17 @@ AC_ARG_ENABLE(distro-toolchain,
[EnableDistroToolchain=@SettingsUseDistroMINGW@]
)
+if test "$HostOS" = "mingw32" -a "$EnableDistroToolchain" = "NO"; then
+ FP_SETUP_WINDOWS_TOOLCHAIN([$hardtop/mingw/], [\$\$topdir/../mingw/])
+fi
+
+if test "$HostOS" = "mingw32"; then
+ WindresCmd="$Windres"
+ AC_SUBST([WindresCmd])
+ AC_SUBST([GenlibCmd])
+ AC_SUBST([HAVE_GENLIB])
+fi
+
dnl ** Which gcc to use?
dnl --------------------------------------------------------------
AC_PROG_CC([gcc clang])
@@ -288,6 +299,7 @@ if test "x$UseLibdw" = "xYES" ; then
fi
AC_SUBST(UseLibdw)
+
FP_SETTINGS
AC_CONFIG_FILES([config.mk])
=====================================
hadrian/bindist/Makefile
=====================================
@@ -63,19 +63,28 @@ show:
.PHONY: install
ifeq "$(TargetOS_CPP)" "mingw32"
-install_bin: install_mingw install_bin_direct
+install_extra: install_mingw
+else
+install_extra:
+endif
+
+ifeq "$(RelocatableBuild)" "YES"
+install_bin: install_bin_direct
else
install_bin: install_bin_libdir install_wrappers
endif
-install: install_bin install_lib
+
+
+install: install_bin install_lib install_extra
install: install_man install_docs update_package_db
-ActualBinsDir=${ghclibdir}/bin
ifeq "$(RelocatableBuild)" "YES"
ActualLibsDir=${ghclibdir}
+ActualBinsDir=${bindir}
else
ActualLibsDir=${ghclibdir}/lib
+ActualBinsDir=${ghclibdir}/bin
endif
WrapperBinsDir=${bindir}
@@ -201,7 +210,7 @@ install_docs:
fi
MAN_SECTION := 1
-MAN_PAGES := doc/users_guide/build-man/ghc.1
+MAN_PAGES := manpage/ghc.1
.PHONY: install_man
install_man:
=====================================
hadrian/bindist/config.mk.in
=====================================
@@ -63,6 +63,13 @@ $(eval $(call set_default,dvidir,$${docdir}))
$(eval $(call set_default,pdfdir,$${docdir}))
$(eval $(call set_default,psdir,$${docdir}))
+# On Windows we normally want to make a relocatable bindist, so we
+# ignore flags like libdir
+ifeq "$(Windows_Host)" "YES"
+RelocatableBuild = YES
+DYNAMIC_GHC_PROGRAMS = NO
+endif
+
ifeq "$(RelocatableBuild)" "YES"
# Hack: our directory layouts tend to be different on Windows, so
@@ -149,13 +156,6 @@ else
GhcWithInterpreter=$(if $(findstring YES,$(DYNAMIC_GHC_PROGRAMS)),YES,NO)
endif
-# On Windows we normally want to make a relocatable bindist, to we
-# ignore flags like libdir
-ifeq "$(Windows_Host)" "YES"
-RelocatableBuild = YES
-else
-RelocatableBuild = NO
-endif
# runhaskell and hsc2hs are special, in that other compilers besides
=====================================
hadrian/src/Builder.hs
=====================================
@@ -362,7 +362,7 @@ instance H.Builder Builder where
HsCpp -> captureStdout
- Make dir -> cmd' path ["-C", dir] buildArgs
+ Make dir -> cmd' buildOptions path ["-C", dir] buildArgs
Makeinfo -> do
cmd' [path] "--no-split" [ "-o", output] [input]
=====================================
hadrian/src/Rules/BinaryDist.hs
=====================================
@@ -108,20 +108,35 @@ other, the install script:
-}
+installTo :: String -> Action ()
+installTo prefix = do
+ root <- buildRoot
+ version <- setting ProjectVersion
+ targetPlatform <- setting TargetPlatformFull
+ let ghcVersionPretty = "ghc-" ++ version ++ "-" ++ targetPlatform
+ bindistFilesDir = root -/- "bindist" -/- ghcVersionPretty
+ runBuilder (Configure bindistFilesDir) ["--prefix="++prefix] [] []
+ runBuilderWithCmdOptions [AddEnv "RelocatableBuild" "YES"] (Make bindistFilesDir) ["install"] [] []
+
bindistRules :: Rules ()
bindistRules = do
root <- buildRootRules
- phony "install" $ do
+ phony "reloc-binary-dist-dir" $ do
need ["binary-dist-dir"]
+ cwd <- liftIO $ IO.getCurrentDirectory
version <- setting ProjectVersion
targetPlatform <- setting TargetPlatformFull
let ghcVersionPretty = "ghc-" ++ version ++ "-" ++ targetPlatform
- bindistFilesDir = root -/- "bindist" -/- ghcVersionPretty
- prefixErr = "You must specify a path with --prefix when using the"
+ let prefix = cwd -/- root -/- "reloc-bindist" -/- ghcVersionPretty
+ installTo prefix
+
+
+ phony "install" $ do
+ need ["binary-dist-dir"]
+ let prefixErr = "You must specify a path with --prefix when using the"
++ " 'install' rule"
installPrefix <- fromMaybe (error prefixErr) <$> cmdPrefix
- runBuilder (Configure bindistFilesDir) ["--prefix="++installPrefix] [] []
- runBuilder (Make bindistFilesDir) ["install"] [] []
+ installTo installPrefix
phony "binary-dist-dir" $ do
-- We 'need' all binaries and libraries
@@ -207,16 +222,6 @@ bindistRules = do
cmd_ (bindistFilesDir -/- "bin" -/- ghcPkgName) ["recache"]
- -- The settings file must be regenerated by the bindist installation
- -- logic to account for the environment discovered by the bindist
- -- configure script on the host. Not on Windows, however, where
- -- we do not ship a configure script with the bindist. See #20254.
- --
- -- N.B. we must do this after ghc-pkg has been run as it will go
- -- looking for the settings files.
- unless windowsHost $
- removeFile (bindistFilesDir -/- "lib" -/- "settings")
-
unless cross $ need ["docs"]
-- TODO: we should only embed the docs that have been generated
@@ -246,41 +251,44 @@ bindistRules = do
-- reference. See #20802.
copyDirectory ("utils" -/- "completion") bindistFilesDir
- -- These scripts are only necessary in the configure/install
- -- workflow which is not supported on windows.
- -- TODO: Instead of guarding against windows, we could offer the
- -- option to make a relocatable, but not installable bindist on any
- -- platform.
- unless windowsHost $ do
- -- We then 'need' all the files necessary to configure and install
- -- (as in, './configure [...] && make install') this build on some
- -- other machine.
- need $ map (bindistFilesDir -/-)
- (["configure", "Makefile"] ++ bindistInstallFiles)
- copyFile ("hadrian" -/- "bindist" -/- "config.mk.in") (bindistFilesDir -/- "config.mk.in")
- copyFile ("hadrian" -/- "cfg" -/- "default.target.in") (bindistFilesDir -/- "default.target.in")
- copyFile ("hadrian" -/- "cfg" -/- "default.host.target.in") (bindistFilesDir -/- "default.host.target.in")
- forM_ bin_targets $ \(pkg, _) -> do
- needed_wrappers <- pkgToWrappers pkg
- forM_ needed_wrappers $ \wrapper_name -> do
- let suffix = if useGhcPrefix pkg
- then "ghc-" ++ version
- else version
- wrapper_content <- wrapper wrapper_name
- let unversioned_wrapper_path = bindistFilesDir -/- "wrappers" -/- wrapper_name
- versioned_wrapper = wrapper_name ++ "-" ++ suffix
- versioned_wrapper_path = bindistFilesDir -/- "wrappers" -/- versioned_wrapper
- -- Write the wrapper to the versioned path
- writeFile' versioned_wrapper_path wrapper_content
- -- Create a symlink from the non-versioned to the versioned.
- liftIO $ do
- IO.removeFile unversioned_wrapper_path <|> return ()
- IO.createFileLink versioned_wrapper unversioned_wrapper_path
-
-
- let buildBinDist :: Compressor -> Action ()
- buildBinDist compressor = do
- need ["binary-dist-dir"]
+ -- Copy the manpage into the binary distribution
+ whenM (liftIO (IO.doesDirectoryExist (root -/- "manpage"))) $ do
+ copyDirectory (root -/- "manpage") bindistFilesDir
+
+ -- We then 'need' all the files necessary to configure and install
+ -- (as in, './configure [...] && make install') this build on some
+ -- other machine.
+ need $ map (bindistFilesDir -/-)
+ (["configure", "Makefile"] ++ bindistInstallFiles)
+ copyFile ("hadrian" -/- "bindist" -/- "config.mk.in") (bindistFilesDir -/- "config.mk.in")
+ copyFile ("hadrian" -/- "cfg" -/- "default.target.in") (bindistFilesDir -/- "default.target.in")
+ copyFile ("hadrian" -/- "cfg" -/- "default.host.target.in") (bindistFilesDir -/- "default.host.target.in")
+
+ -- TODO: do we need these wrappers on Windows?
+ forM_ bin_targets $ \(pkg, _) -> do
+ needed_wrappers <- pkgToWrappers pkg
+ forM_ needed_wrappers $ \wrapper_name -> do
+ let suffix = if useGhcPrefix pkg
+ then "ghc-" ++ version
+ else version
+ wrapper_content <- wrapper wrapper_name
+ let unversioned_wrapper_path = bindistFilesDir -/- "wrappers" -/- wrapper_name
+ versioned_wrapper = wrapper_name ++ "-" ++ suffix
+ versioned_wrapper_path = bindistFilesDir -/- "wrappers" -/- versioned_wrapper
+ -- Write the wrapper to the versioned path
+ writeFile' versioned_wrapper_path wrapper_content
+ -- Create a symlink from the non-versioned to the versioned.
+ liftIO $ do
+ IO.removeFile unversioned_wrapper_path <|> return ()
+ IO.createFileLink versioned_wrapper unversioned_wrapper_path
+
+
+ let buildBinDist = buildBinDistX "binary-dist-dir" "bindist"
+ buildBinDistReloc = buildBinDistX "reloc-binary-dist-dir" "reloc-bindist"
+
+ buildBinDistX :: String -> FilePath -> Compressor -> Action ()
+ buildBinDistX target bindist_folder compressor = do
+ need [target]
version <- setting ProjectVersion
targetPlatform <- setting TargetPlatformFull
@@ -289,15 +297,16 @@ bindistRules = do
-- Finally, we create the archive <root>/bindist/ghc-X.Y.Z-platform.tar.xz
tarPath <- builderPath (Tar Create)
- cmd [Cwd $ root -/- "bindist"] tarPath
+ cmd [Cwd $ root -/- bindist_folder] tarPath
[ "-c", compressorTarFlag compressor, "-f"
, ghcVersionPretty <.> "tar" <.> compressorExtension compressor
, ghcVersionPretty ]
- phony "binary-dist" $ buildBinDist Xz
- phony "binary-dist-gzip" $ buildBinDist Gzip
- phony "binary-dist-bzip2" $ buildBinDist Bzip2
- phony "binary-dist-xz" $ buildBinDist Xz
+ forM_ [("binary", buildBinDist), ("reloc-binary", buildBinDistReloc)] $ \(name, mk_bindist) -> do
+ phony (name <> "-dist") $ mk_bindist Xz
+ phony (name <> "-dist-gzip") $ mk_bindist Gzip
+ phony (name <> "-dist-bzip2") $ mk_bindist Bzip2
+ phony (name <> "-dist-xz") $ mk_bindist Xz
-- Prepare binary distribution configure script
-- (generated under <ghc root>/distrib/configure by 'autoreconf')
=====================================
hadrian/src/Rules/Documentation.hs
=====================================
@@ -43,7 +43,7 @@ archiveRoot :: FilePath
archiveRoot = docRoot -/- "archives"
manPageBuildPath :: FilePath
-manPageBuildPath = docRoot -/- "users_guide/build-man/ghc.1"
+manPageBuildPath = "manpage" -/- "ghc.1"
-- TODO: Get rid of this hack.
docContext :: Context
=====================================
libraries/base/changelog.md
=====================================
@@ -41,6 +41,7 @@
* Make `Semigroup`'s `stimes` specializable. ([CLC proposal #8](https://github.com/haskell/core-libraries-committee/issues/8))
* Deprecate `Data.List.NonEmpty.unzip` ([CLC proposal #86](https://github.com/haskell/core-libraries-committee/issues/86))
* Fixed exponent overflow/underflow bugs in the `Read` instances for `Float` and `Double` ([CLC proposal #192](https://github.com/haskell/core-libraries-committee/issues/192))
+ * Implement `copyBytes`, `fillBytes`, `moveBytes` and `stimes` for `Data.Array.Byte.ByteArray` using primops ([CLC proposal #188](https://github.com/haskell/core-libraries-committee/issues/188))
## 4.18.0.0 *March 2023*
* Shipped with GHC 9.6.1
=====================================
libraries/ghc-prim/ghc-prim.cabal
=====================================
@@ -0,0 +1,108 @@
+cabal-version: 2.2
+name: ghc-prim
+version: 0.10.0
+-- NOTE: Don't forget to update ./changelog.md
+license: BSD-3-Clause
+license-file: LICENSE
+category: GHC
+maintainer: libraries at haskell.org
+bug-reports: https://gitlab.haskell.org/ghc/ghc/issues/new
+synopsis: GHC primitives
+build-type: Custom
+description:
+ This package contains the primitive types and operations supplied by GHC.
+
+ It is an internal package, only for the use of GHC developers.
+ GHC users should not use it! If you do use it then expect
+ breaking changes at any time without warning. You should prefer
+ to import @GHC.Exts@ from the @base@ package instead.
+
+extra-source-files: changelog.md
+
+source-repository head
+ type: git
+ location: https://gitlab.haskell.org/ghc/ghc.git
+ subdir: libraries/ghc-prim
+
+custom-setup
+ setup-depends: base >= 4 && < 5, process, filepath, directory, Cabal >= 1.23 && < 3.9
+
+flag need-atomic
+ default: False
+
+Library
+ default-language: Haskell2010
+ other-extensions:
+ BangPatterns
+ CPP
+ DeriveGeneric
+ MagicHash
+ MultiParamTypeClasses
+ NoImplicitPrelude
+ StandaloneDeriving
+ Trustworthy
+ TypeFamilies
+ UnboxedTuples
+ UnliftedFFITypes
+
+ build-depends: rts == 1.0.*
+
+ exposed-modules:
+ GHC.CString
+ GHC.Classes
+ GHC.Debug
+ GHC.Magic
+ GHC.Magic.Dict
+ GHC.Prim.Ext
+ GHC.Prim.Panic
+ GHC.Prim.Exception
+ GHC.Prim.PtrEq
+ GHC.PrimopWrappers
+ GHC.Tuple
+ GHC.Tuple.Prim
+ GHC.Types
+
+ virtual-modules:
+ GHC.Prim
+
+ -- OS Specific
+ if os(windows)
+ -- Windows requires some extra libraries for linking because the RTS
+ -- is no longer re-exporting them (see #11223)
+ -- ucrt: standard C library. The RTS will automatically include this,
+ -- but is added for completeness.
+ -- mingwex: provides GNU POSIX extensions that aren't provided by ucrt.
+ -- mingw32: Unfortunately required because of a resource leak between
+ -- mingwex and mingw32. the __math_err symbol is defined in
+ -- mingw32 which is required by mingwex.
+ -- user32: provides access to apis to modify user components (UI etc)
+ -- on Windows. Required because of mingw32.
+ extra-libraries: user32, mingw32, mingwex, ucrt
+
+ if os(linux)
+ -- we need libm, but for musl and others we might need libc, as libm
+ -- is just an empty shell.
+ extra-libraries: c, m
+
+ if flag(need-atomic)
+ -- for 64-bit atomic ops on armel (#20549)
+ extra-libraries: atomic
+
+ if !os(ghcjs)
+ c-sources:
+ cbits/atomic.c
+ cbits/bswap.c
+ cbits/bitrev.c
+ cbits/clz.c
+ cbits/ctz.c
+ cbits/debug.c
+ cbits/longlong.c
+ cbits/mulIntMayOflo.c
+ cbits/pdep.c
+ cbits/pext.c
+ cbits/popcnt.c
+ cbits/word2float.c
+
+ -- We need to set the unit ID to ghc-prim (without a version number)
+ -- as it's magic.
+ ghc-options: -this-unit-id ghc-prim
=====================================
m4/fp_settings.m4
=====================================
@@ -43,7 +43,7 @@ dnl ghc-toolchain.
AC_DEFUN([SUBST_TOOLDIR],
[
dnl and Note [How we configure the bundled windows toolchain]
- $1=`echo $$1 | sed 's%'"$mingwpath"'%$$tooldir/mingw%'`
+ $1=`echo "$$1" | sed 's%'"$mingw_prefix"'%'"$mingw_install_prefix"'%g'`
])
# FP_SETTINGS
=====================================
m4/fp_setup_windows_toolchain.m4
=====================================
@@ -1,4 +1,5 @@
-AC_DEFUN([FP_SETUP_WINDOWS_TOOLCHAIN],[
+# Download and install the windows toolchain
+AC_DEFUN([FP_INSTALL_WINDOWS_TOOLCHAIN],[
# Find the mingw-w64 archive file to extract.
if test "$HostArch" = "i386"
then
@@ -72,18 +73,29 @@ AC_DEFUN([FP_SETUP_WINDOWS_TOOLCHAIN],[
# NB. Download and extract the MingW-w64 distribution if required
set_up_tarballs
+])
+
+# Set up the environment variables
+# $1 the actual location of the windows toolchain (before install)
+# $2 the location that the windows toolchain will be installed in, relative to the libdir
+AC_DEFUN([FP_SETUP_WINDOWS_TOOLCHAIN],[
+
# N.B. The parameters which get plopped in the `settings` file used by the
# resulting compiler are computed in `FP_SETTINGS`. Specifically, we use
# $$topdir-relative paths instead of fullpaths to the toolchain, by replacing
# occurrences of $hardtop/inplace/mingw with $$tooldir/mingw
+ mingw_prefix="$1"
+ mingw_install_prefix="$2"
+# mingwpath="$hardtop/inplace/mingw"
+
# Our Windows toolchain is based around Clang and LLD. We use compiler-rt
# for the runtime, libc++ and libc++abi for the C++ standard library
# implementation, and libunwind for C++ unwinding.
- mingwbin="$hardtop/inplace/mingw/bin/"
- mingwlib="$hardtop/inplace/mingw/lib"
- mingwinclude="$hardtop/inplace/mingw/include"
- mingwpath="$hardtop/inplace/mingw"
+ mingwbin="$mingw_prefix/bin/"
+ mingwlib="$mingw_prefix/lib"
+ mingwinclude="$mingw_prefix/include"
+ mingw_mingw32_lib="$mingw_prefix/x86_64-w64-mingw32/lib"
CC="${mingwbin}clang.exe"
CXX="${mingwbin}clang++.exe"
@@ -106,8 +118,8 @@ AC_DEFUN([FP_SETUP_WINDOWS_TOOLCHAIN],[
HaskellCPPArgs="$HaskellCPPArgs -I$mingwinclude"
- CONF_GCC_LINKER_OPTS_STAGE1="-fuse-ld=lld $cflags -L$mingwlib -L$hardtop/inplace/mingw/x86_64-w64-mingw32/lib"
- CONF_GCC_LINKER_OPTS_STAGE2="-fuse-ld=lld $cflags -L$mingwlib -L$hardtop/inplace/mingw/x86_64-w64-mingw32/lib"
+ CONF_GCC_LINKER_OPTS_STAGE1="-fuse-ld=lld $cflags -L$mingwlib -L$mingw_mingw32_lib"
+ CONF_GCC_LINKER_OPTS_STAGE2="-fuse-ld=lld $cflags -L$mingwlib -L$mingw_mingw32_lib"
# N.BOn Windows we can't easily dynamically-link against libc++ since there is
# no RPATH support, meaning that the loader will have no way of finding our
=====================================
testsuite/tests/rename/should_compile/RecordWildCardDeprecation.hs
=====================================
@@ -0,0 +1,10 @@
+{-# LANGUAGE RecordWildCards #-}
+module RecordWildCardDeprecation where
+
+import RecordWildCardDeprecation_aux
+
+f (Foo { .. }) = let a = x in a
+
+g (Foo { .. }) = let a = y in a
+
+h (Foo { .. }) = let a = z in a
\ No newline at end of file
=====================================
testsuite/tests/rename/should_compile/RecordWildCardDeprecation.stderr
=====================================
@@ -0,0 +1,12 @@
+[1 of 2] Compiling RecordWildCardDeprecation_aux ( RecordWildCardDeprecation_aux.hs, RecordWildCardDeprecation_aux.o )
+[2 of 2] Compiling RecordWildCardDeprecation ( RecordWildCardDeprecation.hs, RecordWildCardDeprecation.o )
+
+RecordWildCardDeprecation.hs:6:10: warning: [GHC-68441] [-Wdeprecations (in -Wextended-warnings)]
+ In the use of record field of Foo ‘x’
+ (imported from RecordWildCardDeprecation_aux):
+ Deprecated: "name depr"
+
+RecordWildCardDeprecation.hs:8:10: warning: [GHC-68441] [-Wdeprecations (in -Wextended-warnings)]
+ In the use of record field of Foo ‘y’
+ (imported from RecordWildCardDeprecation_aux):
+ Deprecated: "export depr"
=====================================
testsuite/tests/rename/should_compile/RecordWildCardDeprecation_aux.hs
=====================================
@@ -0,0 +1,5 @@
+module RecordWildCardDeprecation_aux(Foo(Foo, x, z), {-# DEPRECATED "export depr" #-} Foo(y)) where
+
+data Foo = Foo { x :: Int, y :: Bool, z :: Char }
+
+{-# DEPRECATED x "name depr" #-}
\ No newline at end of file
=====================================
testsuite/tests/rename/should_compile/all.T
=====================================
@@ -222,3 +222,4 @@ test('ExportWarnings4', extra_files(['ExportWarnings_base.hs', 'ExportWarnings_a
test('ExportWarnings5', extra_files(['ExportWarnings_base.hs', 'ExportWarnings_aux.hs']), multimod_compile, ['ExportWarnings5', '-v0 -Wno-duplicate-exports -Wx-custom'])
test('ExportWarnings6', normal, compile, ['-Wincomplete-export-warnings'])
test('T22478a', req_th, compile, [''])
+test('RecordWildCardDeprecation', normal, multimod_compile, ['RecordWildCardDeprecation', '-Wno-duplicate-exports'])
View it on GitLab: https://gitlab.haskell.org/ghc/ghc/-/compare/23b384638a411df798a37c4c400262f68be4c6b6...f9a22fd34c2172885993695d6602b5f174e1aaa6