From takenobu.hs at gmail.com Sat Sep 1 08:52:38 2018 From: takenobu.hs at gmail.com (Takenobu Tani) Date: Sat, 1 Sep 2018 17:52:38 +0900 Subject: [Haskell-cafe] What is the best way to search information on Haskell Cafe? In-Reply-To: References: Message-ID: Hi, If you frequently use the "site:" option, you can also use this support web :) https://takenobu-hs.github.io/haskell-wiki-search/?siteview=full Regards, Takenobu 2018年9月1日(土) 2:00 Rodrigo Stevaux : > I am new to mailing lists. Like really new. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From omeragacan at gmail.com Sat Sep 1 11:31:32 2018 From: omeragacan at gmail.com (=?UTF-8?Q?=C3=96mer_Sinan_A=C4=9Facan?=) Date: Sat, 1 Sep 2018 14:31:32 +0300 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Hi, Here are a few things we do regarding compiler/runtime performance: - Each commit goes through some set of tests, some of which also check max. residency, total allocations etc. of the compiler or the compiled program, and fail if those numbers are more than the allowed amount. See [1] for an example. - There's https://perf.haskell.org/ghc/ which does some testing on every commit. I don't know what exactly it's doing (hard to tell from the web page, but I guess it's only running a few select tests/benchmarks?). I've personally never used it, I just know that it exists. - Most of the time if a patch is expected to change compiler or runtime performance the author submits nofib results and updates the perf tests in the test suite for new numbers. This process is manual and sometimes contributors are asked for nofib numbers by reviewers etc. See [2,3] for nofib. 
We currently don't use random testing. [1]: https://github.com/ghc/ghc/blob/565ef4cc036905f9f9801c1e775236bb007b026c/testsuite/tests/perf/compiler/all.T#L30 [2]: https://github.com/ghc/nofib [3]: https://ghc.haskell.org/trac/ghc/wiki/Building/RunningNoFib Ömer Rodrigo Stevaux , 31 Ağu 2018 Cum, 20:54 tarihinde şunu yazdı: > > Hi, > > For those familiar with GHC source code & internals, how are extensions & optimizations tested? And what are the quality policies for accepting new code into GHC? > > I am interested in testing compilers in general using random testing. Is it used on GHC? > > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. From a.pelenitsyn at gmail.com Sat Sep 1 12:34:15 2018 From: a.pelenitsyn at gmail.com (Artem Pelenitsyn) Date: Sat, 1 Sep 2018 08:34:15 -0400 Subject: [Haskell-cafe] What is the best way to search information on Haskell Cafe? In-Reply-To: References: Message-ID: Hi Takenobu, It seems that icqbrowse-links do not track recent updates to the channels since April 11th 2018. Otherwise, nice resource, thanks! -- Best wishes, Artem On Sat, 1 Sep 2018, 04:53 Takenobu Tani, wrote: > Hi, > > If you frequently use the "site:" option, you can also use this support > web :) > > https://takenobu-hs.github.io/haskell-wiki-search/?siteview=full > > Regards, > Takenobu > > > 2018年9月1日(土) 2:00 Rodrigo Stevaux : > >> I am new to mailing lists. Like really new. >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. 
> > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From takenobu.hs at gmail.com Sun Sep 2 02:19:11 2018 From: takenobu.hs at gmail.com (Takenobu Tani) Date: Sun, 2 Sep 2018 11:19:11 +0900 Subject: [Haskell-cafe] What is the best way to search information on Haskell Cafe? In-Reply-To: References: Message-ID: Thanks :) Regards, Takenobu On Sat, Sep 1, 2018 at 9:34 PM Artem Pelenitsyn wrote: > Hi Takenobu, > > It seems that icqbrowse-links do not track recent updates to the channels > since April 11th 2018. Otherwise, nice resource, thanks! > > -- > Best wishes, > Artem > > On Sat, 1 Sep 2018, 04:53 Takenobu Tani, wrote: > >> Hi, >> >> If you frequently use the "site:" option, you can also use this support >> web :) >> >> https://takenobu-hs.github.io/haskell-wiki-search/?siteview=full >> >> Regards, >> Takenobu >> >> >> 2018年9月1日(土) 2:00 Rodrigo Stevaux : >> >>> I am new to mailing lists. Like really new. >>> _______________________________________________ >>> Haskell-Cafe mailing list >>> To (un)subscribe, modify options or view archives go to: >>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >>> Only members subscribed via the mailman list are allowed to post. >> >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From roehst at gmail.com Sun Sep 2 18:05:04 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Sun, 2 Sep 2018 15:05:04 -0300 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Hi Omer, thanks for the reply. The tests you run are for regression testing, that is, non-functional aspects, is my understanding right? What about testing that optimizations and extensions are correct from a functional aspect? Em sáb, 1 de set de 2018 às 08:32, Ömer Sinan Ağacan escreveu: > Hi, > > Here are a few things we do regarding compiler/runtime performance: > > - Each commit goes through some set of tests, some of which also check max. > residency, total allocations etc. of the compiler or the compiled > program, > and fail if those numbers are more than the allowed amount. See [1] for > an > example. > > - There's https://perf.haskell.org/ghc/ which does some testing on every > commit. I don't know what exactly it's doing (hard to tell from the web > page, > but I guess it's only running a few select tests/benchmarks?). I've > personally never used it, I just know that it exists. > > - Most of the time if a patch is expected to change compiler or runtime > performance the author submits nofib results and updates the perf tests > in the > test suite for new numbers. This process is manual and sometimes > contributors > are asked for nofib numbers by reviewers etc. See [2,3] for nofib. > > We currently don't use random testing. > > [1]: > https://github.com/ghc/ghc/blob/565ef4cc036905f9f9801c1e775236bb007b026c/testsuite/tests/perf/compiler/all.T#L30 > [2]: https://github.com/ghc/nofib > [3]: https://ghc.haskell.org/trac/ghc/wiki/Building/RunningNoFib > > Ömer > > Rodrigo Stevaux , 31 Ağu 2018 Cum, 20:54 tarihinde şunu > yazdı: > > > > Hi, > > > > For those familiar with GHC source code & internals, how are extensions > & optimizations tested? And what are the quality policies for accepting new > code into GHC? 
> > > > I am interested in testing compilers in general using random testing. Is > it used on GHC? > > > > > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From svenpanne at gmail.com Sun Sep 2 19:58:42 2018 From: svenpanne at gmail.com (Sven Panne) Date: Sun, 2 Sep 2018 21:58:42 +0200 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Am So., 2. Sep. 2018 um 20:05 Uhr schrieb Rodrigo Stevaux : > Hi Omer, thanks for the reply. The tests you run are for regression > testing, that is, non-functional aspects, is my understanding right? [...] > Quite the opposite, the usual steps are: * A bug is reported. * A regression test is added to GHC's test suite, reproducing the bug ( https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/Adding). * The bug is fixed. This way it is made sure that the bug doesn't come back later. Do this for a few decades, and you have a very comprehensive test suite for functional aspects. :-) The reasoning behind this: Blindly adding tests is wasted effort most of time, because this way you often test things which only very rarely break: Bugs OTOH hint you very concretely at problematic/tricky/complicated parts of your SW. Catching increases in runtime/memory consumption is a slightly different story, because you have to come up with "typical" scenarios to make useful comparisons. You can have synthetic scenarios for very specific parts of the compiler, too, like pattern matching with tons of constructors, or using gigantic literals, or type checking deeply nested tricky things, etc., but I am not sure if such things are usually called "regression tests". Cheers, S. 
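[Editor's note: the "catching increases in runtime/memory consumption" guard described above can be sketched in a few lines. GHC's testsuite stat tests record a baseline metric (e.g. total allocations) and fail when a new measurement drifts outside an allowed tolerance. This is a base-only illustration of the idea; the function names and the baseline figure are made up, not GHC's actual testsuite API.]

```haskell
-- Accept a measurement if it stays within a relative tolerance of a
-- recorded baseline; a perf test fails as soon as this returns False.
withinTolerance :: Double -> Integer -> Integer -> Bool
withinTolerance tol baseline measured =
  fromInteger (abs (measured - baseline)) <= tol * fromInteger baseline

-- Example guard: allow 5% drift around a (hypothetical) allocation baseline.
allocsOK :: Integer -> Bool
allocsOK = withinTolerance 0.05 1000000000
```

[A failing guard then prompts either a fix or a deliberate baseline update — the manual "submit nofib results and update the perf tests for new numbers" step described earlier in the thread.]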
-------------- next part -------------- An HTML attachment was scrubbed... URL: From olf at aatal-apotheke.de Sun Sep 2 20:05:56 2018 From: olf at aatal-apotheke.de (Olaf Klinke) Date: Sun, 2 Sep 2018 22:05:56 +0200 Subject: [Haskell-cafe] Alternative instance for non-backtracking parsers Message-ID: <458E5DB2-2E06-4291-86BC-BC471BD316CB@aatal-apotheke.de> Thanks Bardur for the pointer to bytestring-lexing and cassava. The problem is that my csv files are a little bit idiosyncratic. That is, they have a BOM, semicolons as separators, the fractional numbers are in the format 1.234,567 and there are dates to parse, too. I tried parseTimeM from the time package, but that is slower than my own parser. That said, my megaparsec parser seems to spend quite some time skipping over text with regex ';"[^"]*"', that is, fields whose content does not concern the application at hand. Hence using a CSV library for tokenizing might be a good idea. Thanks to Peter Simons for pointing out that cassava indeed has attoparsec as a dependency. Maybe CSV is a red herring, after all. It's just some text-based syntax where the count of semicolons indicates where in the line I can find the data I'm interested in. I'm more curious about how to make number conversion fast. I've looked at the source of megaparsec-6.5.0, attoparsec-0.13.2.2, bytestring-lexing-0.5.0.2 and base-4.11.1.0. They all do the same thing: Convert digits to numbers individually, then fold the list of digits as follows: f x digit = x * 10 + value digit number = foldl' f 0 digits For 'value' above, Megaparsec uses Data.Char.digitToInt while Attoparsec uses Data.Char.ord. I also rolled my own Double parser for locale reasons. Are there any libraries that handle all the formats 1,234.567 1234.567 1.234,567 1234,567 maybe by specifying a locale up front? It can't be done without, since 123.456 on its own is ambiguous. Maybe the locale can be guessed from the context, maybe not, but that is certainly an expensive operation.
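[Editor's note: the digit fold quoted above extends directly to the locale question: fix the two separator characters up front, strip the grouping separator, and fold the integral and fractional digit runs separately. A base-only sketch — the Locale type here is invented for illustration and exists in no library:]

```haskell
import Data.Char (digitToInt, isDigit)
import Data.List (foldl')

-- The digit fold quoted above, specialised to Integer.
digitsToInteger :: String -> Integer
digitsToInteger = foldl' (\x d -> x * 10 + toInteger (digitToInt d)) 0

-- A "locale" reduced to the two separator characters, fixed up front.
data Locale = Locale { thousandsSep :: Char, decimalSep :: Char }

-- Parses "1.234,567" under Locale '.' ',' and "1,234.567" under
-- Locale ',' '.'; rejects input containing any other characters.
readLocalizedDouble :: Locale -> String -> Maybe Double
readLocalizedDouble (Locale grp dec) s =
  case break (== dec) (filter (/= grp) s) of
    (int, []) | not (null int), all isDigit int ->
      Just (fromInteger (digitsToInteger int))
    (int, _ : frac) | all isDigit int, all isDigit frac, not (null frac) ->
      Just (fromInteger (digitsToInteger int)
              + fromInteger (digitsToInteger frac) / 10 ^ length frac)
    _ -> Nothing
```

[This resolves the 123.456 ambiguity only by making the caller choose a locale up front; guessing from context would indeed need a second pass over the data.]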
MS Excel guesses eagerly, with occasional amusing consequences. Thanks to all who contributed to this thread so far! Olaf From jo at durchholz.org Sun Sep 2 20:43:51 2018 From: jo at durchholz.org (Joachim Durchholz) Date: Sun, 2 Sep 2018 22:43:51 +0200 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: <38072d3b-5a90-46d8-8793-425adeb94898@durchholz.org> Am 02.09.2018 um 21:58 schrieb Sven Panne: > Quite the opposite, the usual steps are: > >    * A bug is reported. >    * A regression test is added to GHC's test suite, reproducing the > bug (https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/Adding). >    * The bug is fixed. > > This way it is made sure that the bug doesn't come back later. That's just the... non-thinking aspect, and more embarrassment avoidance. The first level of automated testing. > Do this > for a few decades, and you have a very comprehensive test suite for > functional aspects. :-) The reasoning behind this: Blindly adding tests > is wasted effort most of time, because this way you often test things > which only very rarely break: Bugs OTOH hint you very concretely at > problematic/tricky/complicated parts of your SW. Well, you have to *think*. You can't just blindly add tests for every bug that was ever reported; you get an every-growing pile of test code, and if the spec changes you need to change the tests. So you need a strategy to curate the test code, and you very much prefer to test for the thing that actually went wrong, not the thing that was reported. I'm pretty sure the GHC guys do, actually; I'm just speaking up so that people don't take this "just add a test whenever a bug occurs" at face value, there's much more to it. > Catching increases in runtime/memory consumption is a slightly different > story, because you have to come up with "typical" scenarios to make > useful comparisons. 
It's just a case where you cannot blindly add a test for every performance regression you see, you have to set up testing beforehand. Which is the exact opposite of what you recommend, so maybe the recommendation shouldn't be taken at face value ;-P > You can have synthetic scenarios for very specific > parts of the compiler, too, like pattern matching with tons of > constructors, or using gigantic literals, or type checking deeply nested > tricky things, etc., but I am not sure if such things are usually called > "regression tests". It's a matter of definition and common usage, but indeed many people associate the term "regression testing" with "let's write a test case whenever we see a bug". This is one of the reasons why I prefer the term "automated testing". It's both more general and encompasses all the things that one does. Oh, and sometimes you even add a test blindly due to a bug report. It's still a good first line of defense, it's just not what you should always do, and never without thinking about an alternative. Regards, Jo From roehst at gmail.com Mon Sep 3 01:40:19 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Sun, 2 Sep 2018 22:40:19 -0300 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Thanks for the clarification. What I am hinting at is, the Csmith project caught many bugs in C compilers by using random testing -- feeding random programs and testing if the optimizations preserved program behavior. Haskell, having tens of optimizations, could be a potential application of the same technique. I have no familiarity with the GHC or with any compilers in general; I am just looking for something to study. My questions in its most direct form is, as in your view, could GHC optimizations hide bugs that could be potentially be revealed by exploring program spaces? Em dom, 2 de set de 2018 às 16:58, Sven Panne escreveu: > Am So., 2. Sep. 
2018 um 20:05 Uhr schrieb Rodrigo Stevaux < > roehst at gmail.com>: > >> Hi Omer, thanks for the reply. The tests you run are for regression >> testing, that is, non-functional aspects, is my understanding right? [...] >> > > Quite the opposite, the usual steps are: > > * A bug is reported. > * A regression test is added to GHC's test suite, reproducing the bug ( > https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/Adding). > * The bug is fixed. > > This way it is made sure that the bug doesn't come back later. Do this for > a few decades, and you have a very comprehensive test suite for functional > aspects. :-) The reasoning behind this: Blindly adding tests is wasted > effort most of time, because this way you often test things which only very > rarely break: Bugs OTOH hint you very concretely at > problematic/tricky/complicated parts of your SW. > > Catching increases in runtime/memory consumption is a slightly different > story, because you have to come up with "typical" scenarios to make useful > comparisons. You can have synthetic scenarios for very specific parts of > the compiler, too, like pattern matching with tons of constructors, or > using gigantic literals, or type checking deeply nested tricky things, > etc., but I am not sure if such things are usually called "regression > tests". > > Cheers, > S. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From svenpanne at gmail.com Mon Sep 3 06:29:54 2018 From: svenpanne at gmail.com (Sven Panne) Date: Mon, 3 Sep 2018 08:29:54 +0200 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: <38072d3b-5a90-46d8-8793-425adeb94898@durchholz.org> References: <38072d3b-5a90-46d8-8793-425adeb94898@durchholz.org> Message-ID: Am So., 2. Sep. 2018 um 22:44 Uhr schrieb Joachim Durchholz < jo at durchholz.org>: > That's just the... non-thinking aspect, and more embarrassment > avoidance. The first level of automated testing. 
> Well, even avoiding embarrassing bugs is extremely valuable. The vast amount of bugs in real-world SW *is* actually highly embarrassing, and even worse: Similar bugs have probably been introduced before. Getting some tricky algorithm wrong is the exception, at least for two reasons: The majority of code is typically very mundane and boring, and people are usually more awake and concentrated when they know that they are writing non-trivial stuff. Of course your mileage varies, depending on the domain, experience of programmers, deadline pressure, etc. > > Do this > > for a few decades, and you have a very comprehensive test suite for > > functional aspects. :-) The reasoning behind this: Blindly adding tests > > is wasted effort most of time, because this way you often test things > > which only very rarely break: Bugs OTOH hint you very concretely at > > problematic/tricky/complicated parts of your SW. > > Well, you have to *think*. > You can't just blindly add tests for every bug that was ever reported; > you get an every-growing pile of test code, and if the spec changes you > need to change the tests. So you need a strategy to curate the test > code, and you very much prefer to test for the thing that actually went > wrong, not the thing that was reported. > Two things here: I never proposed to add the exact code from the bug report to a test suite. Bug reports are ususally too big and too unspecific, so of course you add a minimal, focused test triggering the buggy behavior. Furthermore: If the spec changes, your tests *must* break, by all means, otherwise: What are the tests actually testing if it's not the spec? Of course only those tests should break which test the changed part of the spec. > It's just a case where you cannot blindly add a test for every > performance regression you see, you have to set up testing beforehand. 
> Which is the exact opposite of what you recommend, so maybe the > recommendation shouldn't be taken at face value ;-P > This is exactly why I said that these tests are a different story. For performance measurements there is no binary "failed" or "correct" outcome, because typically many tradeoffs are involved (space vs. time etc.). Therefore you have to define what you consider important, measure that, and guard it against regressions. It's a matter of definition and common usage, but indeed many people > associate the term "regression testing" with "let's write a test case > whenever we see a bug". [...] > This sounds far too disparaging, and a quite a few companies have a rule like "no bug fix gets committed without an accompanying regression test" for a good reason. People usually have no real clue where their most problematic code is (just like they have no clue where the most performance-critical part is), so having *some* hint (bug report) is far better than guessing without any hint. Cheers, S. -------------- next part -------------- An HTML attachment was scrubbed... URL: From 78emil at gmail.com Mon Sep 3 07:08:12 2018 From: 78emil at gmail.com (Emil Axelsson) Date: Mon, 3 Sep 2018 09:08:12 +0200 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Have a look at Michal Palka's Ph.D. thesis: https://research.chalmers.se/publication/195849 IIRC, his testing revealed several strictness bugs in GHC when compiling with optimization. / Emil Den 2018-09-03 kl. 03:40, skrev Rodrigo Stevaux: > Thanks for the clarification. > > What I am hinting at is, the Csmith project caught many bugs in C > compilers by using random testing -- feeding random programs and > testing if the optimizations preserved program behavior. > > Haskell, having tens of optimizations, could be a potential > application of the same technique. 
> > I have no familiarity with the GHC or with any compilers in general; I > am just looking for something to study. > > My questions in its most direct form is, as in your view, could GHC > optimizations hide bugs that could be potentially be revealed by > exploring program spaces? > > Em dom, 2 de set de 2018 às 16:58, Sven Panne > escreveu: > > Am So., 2. Sep. 2018 um 20:05 Uhr schrieb Rodrigo Stevaux > >: > > Hi Omer, thanks for the reply. The tests you run are for > regression testing, that is, non-functional aspects, is my > understanding right? [...] > > > Quite the opposite, the usual steps are: > >    * A bug is reported. >    * A regression test is added to GHC's test suite, reproducing > the bug > (https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/Adding). >    * The bug is fixed. > > This way it is made sure that the bug doesn't come back later. Do > this for a few decades, and you have a very comprehensive test > suite for functional aspects. :-) The reasoning behind this: > Blindly adding tests is wasted effort most of time, because this > way you often test things which only very rarely break: Bugs OTOH > hint you very concretely at problematic/tricky/complicated parts > of your SW. > > Catching increases in runtime/memory consumption is a slightly > different story, because you have to come up with "typical" > scenarios to make useful comparisons. You can have synthetic > scenarios for very specific parts of the compiler, too, like > pattern matching with tons of constructors, or using gigantic > literals, or type checking deeply nested tricky things, etc., but > I am not sure if such things are usually called "regression tests". > > Cheers, >    S. > > > > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. 
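[Editor's note: the Csmith-style property under discussion — optimization must preserve program behavior — can be stated on a toy scale in a few lines. This sketch enumerates a tiny expression language exhaustively rather than randomly (a QuickCheck generator would replace the enumeration in real use) and checks that a small constant-folding optimizer preserves evaluation; it is an illustration of the technique, not GHC code:]

```haskell
-- A toy expression language, a small optimizer, and a check that the
-- optimizer preserves the meaning of every program in a finite space.
data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b

-- Constant folding plus the identities x + 0 = x and x * 0 = 0.
optimize :: Expr -> Expr
optimize (Lit n)   = Lit n
optimize (Add a b) = case (optimize a, optimize b) of
  (Lit 0, b')    -> b'
  (a', Lit 0)    -> a'
  (Lit x, Lit y) -> Lit (x + y)
  (a', b')       -> Add a' b'
optimize (Mul a b) = case (optimize a, optimize b) of
  (Lit 0, _)     -> Lit 0
  (_, Lit 0)     -> Lit 0
  (Lit x, Lit y) -> Lit (x * y)
  (a', b')       -> Mul a' b'

-- All expressions up to a given depth over a few seed literals.
exprs :: Int -> [Expr]
exprs 0 = map Lit [-1, 0, 2]
exprs d = exprs 0 ++ [op a b | op <- [Add, Mul], a <- exprs (d - 1), b <- exprs (d - 1)]

-- The property Csmith-style testing checks with random programs:
-- optimization must not change observable behavior.
preserved :: Bool
preserved = all (\e -> eval e == eval (optimize e)) (exprs 2)
```

[Palka's thesis does essentially this at full scale, with random lambda terms and strictness-sensitive observations instead of a toy arithmetic language.]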
From vlatko.basic at gmail.com Mon Sep 3 08:59:11 2018 From: vlatko.basic at gmail.com (Vlatko Basic) Date: Mon, 3 Sep 2018 10:59:11 +0200 Subject: [Haskell-cafe] Applying typed hole proposal leads to compilation failure In-Reply-To: <20180824213656.jguyjyejpftfnjgo@mniip.com> References: <38e8c44b-7a1e-ca30-76bf-cf4a063cbda0@gmail.com> <20180824213656.jguyjyejpftfnjgo@mniip.com> Message-ID: <065a7baf-4d6e-eda5-79a0-419b6d1bc872@gmail.com> Hi mniip, Let me first apologise for my very late response. I went for a visit to the analog world, and stayed much longer than planned. :-) I have ScopedTypeVariables enabled as a default extension in .cabal file, but have never encountered such an error, to have to manually specify forall just for making scoped types to work. I'm using local signatures quite often, but still not quite clear as to how/where the original code differs, for example, from this one (which compiles fine): mkTransUnitValTag :: (HasGlobals s) => InNode -> MS c s TransUnitValT mkTransUnitValTag e@(Element "tuv" as (cleanBlank -> cs) _) = do   TransUnitValT <$> attrGlobDef e glbDataType "datatype"   as -- tuvDataType                  ...                 <*> parseTag    e "seg" mkSegTag           cs -- tuvSeg   where     mkSegTag :: InNode -> MS c s Content     mkSegTag (Element "seg" _as ss _) = checkContent =<< mapM mkContentTag ss Is the main diff that 'run' is having monad stack as input and is running it, while 'mkSegTag' is run in it (so forall does not have to be specified manually)?     mkSegTag :: InNode -> MS c s Content     f1 :: forall m c. (MonadIO m) => c -> m ()  -- original code        where run :: MS c Int a -> (Either String a, Int) Thanks for pointing me to read the whole error/warning. Everything is actually written there, but seems I have developed some kind of forall blindness. 
:-( On 24/08/2018 23:36, mniip wrote: >> • Found type wildcard ‘_c’ standing for ‘c’ >>    Where: ‘c’ is a rigid type variable bound by >>             the type signature for: >>               f1 :: forall (m :: * -> *) c. MonadIO m => c -> m Bool >>             at Test.hs:15:1-32 > Emphasis on "rigid". It's not telling you to introduce a new type > variable and put that there. It's telling you that the type you need to > put there is an existing type variable's type. > > When you write 'run :: MS c Int a -> (Either String a, Int)' you > implicitly mean 'run :: forall c.' which is exactly introducing a new > type variable. > >> • Couldn't match type ‘c1’ with ‘c’ >>    ‘c1’ is a rigid type variable bound by >>      the type signature for: >>        run :: forall c1 a. MS c1 Int a -> (Either String a, Int) > This is the 'c' you bound with the implicit 'forall'. The compiler is > asked to verify that 'run' indeed works 'forall c1', so during > typechecking of the function body the 'c1' variable is also rigid. > >>    ‘c’ is a rigid type variable bound by >>      the type signature for: >>        f1 :: forall (m :: * -> *) c. MonadIO m => c -> m Bool > This is the 'c' from the typed hole suggestion up above, still rigid. > > A part of the typechecking algorithm is that two rigid type variables > cannot be equated. > > The solution *actually* proposed by GHC in the wildcard suggestion is to > use the 'c' variable from 'f1's type for which you need to make it > scoped with an explicit 'forall': > > f1 :: forall c. 
(MonadIO m) => c -> m () > f1 c = do >   let _x1 = run f2 >   let _x2 = run f3 >   return () >   where >     run :: MS c Int a -> (Either String a, Int) >     run = runMS c 0 >     f2 :: MS c s Bool >     f2 = pure False >     f3 :: MS c s [Int] >     f3 = pure [] > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From roehst at gmail.com Mon Sep 3 13:11:51 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Mon, 3 Sep 2018 10:11:51 -0300 Subject: [Haskell-cafe] Testing of GHC extensions & optimizations In-Reply-To: References: Message-ID: Ok this is the kind of stuff im looking for. This is great. Many thanks for the insight. Em seg, 3 de set de 2018 às 04:08, Emil Axelsson <78emil at gmail.com> escreveu: > Have a look at Michal Palka's Ph.D. thesis: > > https://research.chalmers.se/publication/195849 > > IIRC, his testing revealed several strictness bugs in GHC when compiling > with optimization. > > / Emil > > Den 2018-09-03 kl. 03:40, skrev Rodrigo Stevaux: > > Thanks for the clarification. > > > > What I am hinting at is, the Csmith project caught many bugs in C > > compilers by using random testing -- feeding random programs and > > testing if the optimizations preserved program behavior. > > > > Haskell, having tens of optimizations, could be a potential > > application of the same technique. > > > > I have no familiarity with the GHC or with any compilers in general; I > > am just looking for something to study. > > > > My questions in its most direct form is, as in your view, could GHC > > optimizations hide bugs that could be potentially be revealed by > > exploring program spaces? 
> > > > Em dom, 2 de set de 2018 às 16:58, Sven Panne > > escreveu: > > > > Am So., 2. Sep. 2018 um 20:05 Uhr schrieb Rodrigo Stevaux > > >: > > > > Hi Omer, thanks for the reply. The tests you run are for > > regression testing, that is, non-functional aspects, is my > > understanding right? [...] > > > > > > Quite the opposite, the usual steps are: > > > > * A bug is reported. > > * A regression test is added to GHC's test suite, reproducing > > the bug > > (https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/Adding > ). > > * The bug is fixed. > > > > This way it is made sure that the bug doesn't come back later. Do > > this for a few decades, and you have a very comprehensive test > > suite for functional aspects. :-) The reasoning behind this: > > Blindly adding tests is wasted effort most of time, because this > > way you often test things which only very rarely break: Bugs OTOH > > hint you very concretely at problematic/tricky/complicated parts > > of your SW. > > > > Catching increases in runtime/memory consumption is a slightly > > different story, because you have to come up with "typical" > > scenarios to make useful comparisons. You can have synthetic > > scenarios for very specific parts of the compiler, too, like > > pattern matching with tons of constructors, or using gigantic > > literals, or type checking deeply nested tricky things, etc., but > > I am not sure if such things are usually called "regression tests". > > > > Cheers, > > S. > > > > > > > > > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From will.yager at gmail.com Mon Sep 3 15:29:35 2018 From: will.yager at gmail.com (Will Yager) Date: Mon, 3 Sep 2018 08:29:35 -0700 Subject: [Haskell-cafe] Alternative instance for non-backtracking parsers In-Reply-To: <51D62184-BA55-4789-A467-A042ABDF722B@aatal-apotheke.de> References: <51D62184-BA55-4789-A467-A042ABDF722B@aatal-apotheke.de> Message-ID: <4A857511-AA99-427C-864F-51E4523FCCE4@gmail.com> On Aug 30, 2018, at 11:21, Olaf Klinke wrote: > > [*] To the parser experts on this list: How much time should a parser take that processes a 50MB, 130000-line text file, extracting 5 values (String, UTCTime, Int, Double) from each line? > _______________________________________________ > The combination of attoparsec + a streaming adapter for pipes/conduit/streaming should easily be able to handle tens of megabytes per second and hundreds of thousands of lines per second. For an example, check out https://github.com/wyager/Callsigns/blob/master/Callsigns.hs Which parses a pipe-separated-value file from the FCC pretty quickly. As I recall it goes through a >100MB file in under three seconds, and it has to do a bunch of other work besides. I also ported the above code to use Streaming instead of Pipes. I recall that using Streaming master, the parser I use to read the dictionary: takeTill isEndOfLine <* endOfLine Handles about 3 million lines per second. I can’t remember what the number is for Pipes but it’s probably similar. That’s really good for such a simple thing to write! Unfortunately there is a performance bug in Streaming that’s fixed in master but hasn’t been released for a number of months :-/ —Will -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From olf at aatal-apotheke.de Mon Sep 3 18:33:06 2018 From: olf at aatal-apotheke.de (Olaf Klinke) Date: Mon, 3 Sep 2018 20:33:06 +0200 Subject: [Haskell-cafe] Alternative instance for non-backtracking parsers In-Reply-To: <4A857511-AA99-427C-864F-51E4523FCCE4@gmail.com> References: <51D62184-BA55-4789-A467-A042ABDF722B@aatal-apotheke.de> <4A857511-AA99-427C-864F-51E4523FCCE4@gmail.com> Message-ID: <43055B3C-2AE9-49D5-B528-1B64BC6EB65B@aatal-apotheke.de> > On 03.09.2018 at 17:29, Will Yager wrote: > > > > On Aug 30, 2018, at 11:21, Olaf Klinke wrote: > >> >> [*] To the parser experts on this list: How much time should a parser take that processes a 50MB, 130000-line text file, extracting 5 values (String, UTCTime, Int, Double) from each line? >> _______________________________________________ >> > > The combination of attoparsec + a streaming adapter for pipes/conduit/streaming should easily be able to handle tens of megabytes per second and hundreds of thousands of lines per second. That's good to know, so there is plenty of room for improvement left. > > For an example, check out https://github.com/wyager/Callsigns/blob/master/Callsigns.hs > > Which parses a pipe-separated-value file from the FCC pretty quickly. As I recall it goes through a >100MB file in under three seconds, and it has to do a bunch of other work besides. The parser does nothing except chunk up the line's text and replace parts of it by constants. I'm surprised and pleased though that HashMaps have such good performance. Profiling shows that my parser now spends most time converting to numbers and dates. I wrote a primitive skipSep :: Int -> Char -> Parser () which skips over input until it has read a given number of the given character. This cut down overall execution time from 12s to 6s, meaning the parsing time is down by more than 50%. It seems the combinators like *> and >>= and 'manyTill' do have a non-negligible cost, so combining from fewer parts makes the parser faster. 
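[Editorial note: a minimal sketch of a combinator in the shape Olaf describes, assuming attoparsec's Char8 interface; the name skipSep and its exact behaviour are inferred from his description, not taken from any published library.]

```haskell
import Control.Monad (replicateM_)
import Data.Attoparsec.ByteString.Char8 (Parser, char, skipWhile)

-- Skip input until n occurrences of the given separator character
-- have been consumed, using only cheap primitives (no manyTill,
-- no per-character bind).
skipSep :: Int -> Char -> Parser ()
skipSep n sep = replicateM_ n (skipWhile (/= sep) *> char sep)
```

Each iteration runs attoparsec's fast skipWhile scan and then consumes one separator, which matches the "combine from fewer parts" observation above.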
Would other parser users agree? Olaf From byorgey at gmail.com Tue Sep 4 02:34:48 2018 From: byorgey at gmail.com (Brent Yorgey) Date: Mon, 3 Sep 2018 21:34:48 -0500 Subject: [Haskell-cafe] Diagrams In-Reply-To: <47b2dd405b68423a907c8dfa285faeb5@mun.ca> References: <47b2dd405b68423a907c8dfa285faeb5@mun.ca> Message-ID: Hi Roger, Development on diagrams is rather slow at the moment, but it is still actively used and maintained. The best places to get help are (1) on the #diagrams IRC channel on Freenode (you probably won't get an immediate response, but any messages will definitely be seen and will get responded to eventually if you stick around), (2) the mailing list as others pointed out. We try to be a helpful bunch and would love to help you get diagrams working for you. -Brent On Tue, Aug 28, 2018 at 1:39 PM Roger Mason wrote: > Hello, > > Does anyone know the current status of the Diagrams package? > I'm having some trouble and wonder if anyone can help, given > the last release was in 2016. > > Thanks, > Roger > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From daniel.rolls.27 at googlemail.com Tue Sep 4 11:21:55 2018 From: daniel.rolls.27 at googlemail.com (Daniel Rolls) Date: Tue, 4 Sep 2018 12:21:55 +0100 Subject: [Haskell-cafe] Building stack projects as docker images Message-ID: I am building Haskell web services using docker. I do this utilising the stack tool within the official Haskell Docker image. Doing this has a number of advantages - It's predictable - I can control bringing in new versions of stack and ghc. 
- It's easy to share - no stack or Haskell installation is required to build the image - These services can interact with other services in a controlled environment using Docker compose. One downside is that doing this naively means waiting for all modules to download every time docker build is run, since the stack build command will start from scratch each time. I worked around this by adding two build steps. The first runs a hello world example with the stack.yml and package.yml from my project. The second build builds the intended code. This trick separated the downloading of dependent libraries from building the code and means day-to-day code changes are quick to build, since the first stack build is cached as a docker image layer. Still though, making a small change to a stack yml file means a long wait re-downloading all dependent libraries. Firstly, is there a better way of doing this? I know stack supports building within docker, but when I worked this way I still found projects depended on the system stack and would commonly fail due to stack bugs. Secondly, my two "stack build" calls, where the first builds a hello world program, feel a bit hacky. Is there a better way to achieve this? Thanks, Dan -------------- next part -------------- An HTML attachment was scrubbed... URL: From r.soeldner at gmail.com Tue Sep 4 11:54:53 2018 From: r.soeldner at gmail.com (Robert Soeldner) Date: Tue, 4 Sep 2018 13:54:53 +0200 Subject: [Haskell-cafe] HDBC packages looking for maintainer In-Reply-To: <20180830132402.afxitjdepdsuydoj@nibbler> References: <20180830132402.afxitjdepdsuydoj@nibbler> Message-ID: Hey Tobias, sorry for the late response, just saw the reply from Erik - I missed your message... We would be really happy to see a PR or a new issue. Robert On Thu, 30 Aug 2018 at 15:24, Tobias Dammers < tdammers at gmail.com> wrote: > Hi, > > I'd be interested. 
I've used HDBC on a few projects, and my yeshql > library was originally built with HDBC as the only backend. It would be > a terrible shame to see this bitrot. > > Cheers, > > Tobias (tdammers on github etc.) > > On Mon, Aug 13, 2018 at 12:07:38PM +0200, Erik Hesselink wrote: > > Hi all, > > > > I've been the maintainer for some of the HDBC packages for a while now. > > Sadly, I've mostly neglected them due to lack of time and usage. While > the > > packages mostly work, there are occasional pull requests and updates for > > new compiler versions. > > > > Because of this I'm looking for someone who wants to take over HDBC and > > related packages [1]. If you use HDBC and would like to take over > > maintainership, please let me know and we can get things set up. > > > > Regards, > > > > Erik > > > > [1] https://github.com/hdbc > > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > > > -- > Tobias Dammers - tdammers at gmail.com > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmason at mun.ca Tue Sep 4 13:32:25 2018 From: rmason at mun.ca (Roger Mason) Date: Tue, 04 Sep 2018 11:02:25 -0230 Subject: [Haskell-cafe] Diagrams In-Reply-To: References: <47b2dd405b68423a907c8dfa285faeb5@mun.ca> Message-ID: <85389c911b64630b8ca0ef7f351c0a20@mun.ca> Hello Brent and others, On 2018-09-04 00:04, Brent Yorgey wrote: > Hi Roger, > > Development on diagrams is rather slow at the moment, but it is still > actively used and maintained. 
The best places to get help are (1) on > the #diagrams IRC channel on Freenode (you probably won't get an > immediate response, but any messages will definitely be seen and will > get responded to eventually if you stick around), (2) the mailing list > as others pointed out. We try to be a helpful bunch and would love to > help you get diagrams working for you. > > -Brent I am traveling and so have intermittent connectivity, so my responses have been rather slow. After some tinkering I have made progress in the last couple of days. If I get stuck again (I am a Haskell novice) I will certainly ask on #diagrams or the mailing list. Thanks again to all who responded. Roger From johannes.waldmann at htwk-leipzig.de Tue Sep 4 16:36:01 2018 From: johannes.waldmann at htwk-leipzig.de (Johannes Waldmann) Date: Tue, 4 Sep 2018 18:36:01 +0200 Subject: [Haskell-cafe] small example for space-efficiency via lazy evaluation? Message-ID: <7af4c401-4716-e5e0-1ee3-3be95ea0ed3e@htwk-leipzig.de> Dear Cafe, I wanted to demonstrate that main = print $ sum $ map (^ 2) $ [ 1 :: Int .. 10^8 ], without any optimisations, still runs in constant space because garbage is collected immediately. Except that it does not: ghc -O0 space.hs -rtsopts -ddump-simpl ./space +RTS -M80k -A10k gives me unoptimized Core as expected, but exhausts the heap. After some experimentation, I found that replacing Prelude.sum with Data.Foldable.foldl' (+) 0 achieves what I want. I think the reason is that instance Foldable [] implements sum by foldl (non-strict). Both versions will run without any allocation as soon as we compile with -O1. So, my question is, what do you use as a (teaching) example for space-efficiency via lazy evaluation? Preferably a one-liner, using only standard libraries, and such that the effect is not rendered moot by -O2. - J PS: Is it magic that foldl and foldl' produce identical core here? 
$wgo_s5we (w_s5w8 :: GHC.Prim.Int#) (ww1_s5wc :: GHC.Prim.Int#) = case GHC.Prim.==# w_s5w8 ww_s5w5 of { __DEFAULT -> jump $wgo_s5we (GHC.Prim.+# w_s5w8 1#) (GHC.Prim.+# ww1_s5wc (GHC.Prim.*# w_s5w8 w_s5w8)); From travis at anduril.com Tue Sep 4 16:38:38 2018 From: travis at anduril.com (Travis Whitaker) Date: Tue, 4 Sep 2018 09:38:38 -0700 Subject: [Haskell-cafe] [Haskell, FP] Anduril Industries is Hiring Message-ID: Anduril Industries (https://www.anduril.com/careers) is hiring. Come write Haskell, Nix, and a bit of C++ to solve problems in detection, tracking, hardware interfaces, sensor fusion, and computer vision. Here are just a few of the things we're hacking on in Haskell: - Nix workflow tools for cross-compilation, CI, deployment, and upgrades over heterogeneous, unreliable infrastructure. - Radar signal processing (with CUDA via *Accelerate*), target detection, and real-time visualization. - sUAS controls and mission planning. - Comprehensively tested (i.e. thoroughly QuickCheck'd/SmallCheck'd) implementations of industrial hardware interface protocols like CAN, CANopen, MAVLink, etc., as well as internally developed protocols. - High-reliability systems for performing health checks and over-the-air firmware upgrades of embedded systems deployed in remote environments. - A library of low-latency, high-throughput video processing components, used for performing image stabilization, object detection, and transcoding in real time on streaming video. - TUI debugging tools built with *brick*. We're looking for junior and senior devs who are able to relocate to our lab in Orange County, California, USA. If this sounds interesting, shoot me an email at travis at anduril.com Thanks for reading! Travis Whitaker -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From johannes.waldmann at htwk-leipzig.de Tue Sep 4 17:00:05 2018 From: johannes.waldmann at htwk-leipzig.de (Johannes Waldmann) Date: Tue, 4 Sep 2018 19:00:05 +0200 Subject: [Haskell-cafe] how to Enumerate without lists? Message-ID: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> Dear Cafe (again), I was trying to write sum $ map (^ 2) $ [ 1 :: Int .. 10^8 ] in a list-free style getSum $ foldMap (Sum . (^ 2)) $ [ 1 :: Int .. 10^8 ] This avoids building an intermediate list (of squares) but it will allocate, since foldMap uses foldr (by default, and it's not overridden for lists) The conclusion would be: not to use lists at all. Which will be the point of my talk anyway. But here, we get a list from enumFromTo. Can we avoid that? Let's try: we just reify enumFromTo data Enumerator a = Enumerator {from :: a, to :: a} we have this for decades, it is called (a,a) in Data.Ix, and then instance Foldable Enumerator where ... Oh no, foldMap (and others) would need an Enum constraint! - J From david.feuer at gmail.com Tue Sep 4 17:17:11 2018 From: david.feuer at gmail.com (David Feuer) Date: Tue, 4 Sep 2018 13:17:11 -0400 Subject: [Haskell-cafe] how to Enumerate without lists? In-Reply-To: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> References: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> Message-ID: I don't really understand your purpose. There are many ways to write code that GHC is good at optimizing, but there are far fewer ways to write code that will compile to non-allocating loops without optimization. Heck, without the worker-wrapper transformation demand analysis enables, you can't even get a non-allocating counter without unboxing by hand and using primops for arithmetic. On Tue, Sep 4, 2018, 1:00 PM Johannes Waldmann < johannes.waldmann at htwk-leipzig.de> wrote: > Dear Cafe (again), > > > I was trying to write > > sum $ map (^ 2) $ [ 1 :: Int .. 10^8 ] > > in a list-free style > > getSum $ foldMap (Sum . (^ 2)) $ [ 1 :: Int .. 
10^8 ] > > This avoids building an intermediate list (of squares) > but it will allocate, since foldMap uses foldr > (by default, and it's not overridden for lists) > > The conclusion would be: not to use lists at all. > Which will be the point of my talk anyway. > > > But here, we get a list from enumFromTo. Can we avoid that? > > Let's try: we just reify enumFromTo > > data Enumerator a = Enumerator {from :: a, to :: a} > > we have this for decades, it is called (a,a) in Data.Ix, > and then > > instance Foldable Enumerator where ... > > Oh no, foldMap (and others) would need an Enum constraint! > > > - J > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. 
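[Editorial note: David's remark about hand-unboxing can be made concrete. This is a sketch (not from the thread) of the sum-of-squares loop written directly against GHC's unboxed Int# and its primops, the style that stays allocation-free even without the worker-wrapper transformation that demand analysis enables.]

```haskell
{-# LANGUAGE MagicHash #-}
module Main where

import GHC.Exts (Int (I#), Int#, (+#), (*#), (>#))

-- Sum of squares from 1 to n, with the counter and accumulator
-- kept unboxed by hand, so the loop allocates nothing.
sumSquares :: Int -> Int
sumSquares (I# n) = I# (go 1# 0#)
  where
    go :: Int# -> Int# -> Int#
    go i acc = case i ># n of
      1# -> acc
      _  -> go (i +# 1#) (acc +# (i *# i))

main :: IO ()
main = print (sumSquares 100)  -- prints 338350
```

This is essentially the shape of the $wgo loop that GHC itself produces at -O1, written out manually.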
-------------- next part -------------- An HTML attachment was scrubbed... URL: From tanuki at gmail.com Tue Sep 4 17:29:25 2018 From: tanuki at gmail.com (Theodore Lief Gannon) Date: Tue, 4 Sep 2018 10:29:25 -0700 Subject: [Haskell-cafe] Building stack projects as docker images In-Reply-To: References: Message-ID: We're dealing with this same hassle at Slant. For starters, instead of your hello-world build, you can use: stack build --dependencies-only We've gone a few steps farther to make Kubernetes deployment easier. We have a base container for web services with this Dockerfile: FROM haskell:8.2.1 as buildenv WORKDIR /app RUN apt-get update && apt-get -y install make xz-utils libpq-dev RUN stack upgrade --binary-version 1.6.1 RUN stack update && stack setup --resolver lts-11.22 RUN stack build Cabal --resolver lts-11.22 RUN stack build haskell-src-exts --resolver lts-11.22 RUN stack build lens --resolver lts-11.22 RUN stack build aeson --resolver lts-11.22 RUN stack build http-conduit --resolver lts-11.22 RUN stack build servant-server --resolver lts-11.22 We'll occasionally make a new version of this with a bumped LTS version. When we started there was no official 8.4-base docker image, but since we're using stack it doesn't really matter. The explicit install of stack-1.6.1 is due to an error that occurs when trying to pull indexes with 1.7.1 on the kubernetes build server; there's probably a cleaner solution but we're not using any 1.7.1 features yet. Optimally the order here should be "least likely to change goes first" but we haven't done any real analysis on that. From here, individual services have a Dockerfile roughly like so: FROM debian:jessie as runtimeenv RUN apt-get update && apt-get install -y libgmp-dev && rm -rf /var/lib/apt/lists/* FROM /haskell-webserver:lts-11.22 as dependencies # pre-build any additional slow local dependencies COPY stack.yaml . COPY package.yaml . 
COPY submodules ./submodules RUN rm -rf ~/.stack/indices/ RUN stack build --dependencies-only FROM dependencies as build COPY . . RUN stack build --ghc-options="-O2" # additional options as desired FROM runtimeenv WORKDIR /app COPY --from=build /app/.stack-work/install/x86_64-linux/lts-11.22/8.2.2/bin/app-name-exe . CMD ./app-name-exe This gives us a reasonable build speed and nice lean deploy container. On Tue, Sep 4, 2018 at 4:22 AM Daniel Rolls via Haskell-Cafe < haskell-cafe at haskell.org> wrote: > I am building Haskell web services using docker. I do this utilising the > stack tool within the official Haskell Docker image. Doing this has a > number of advantages > > - It's predictable - I can control bringing in new versions of stack and > ghc. > - It's easy to share - no stack or Haskell installation is required to > build the image > - These services can interact with other services in a controlled > environment using Docker compose. > > One downside is that I doing this naively means waiting for a modules to > download every time docker build is run since the stack build command will > start from scratch each time. I worked around this by adding two build > steps. The first runs a hello world example with the stack.yml and > package.yml from my project. The second build builds the intended code. > This trick separated the downloading of dependent libraries from building > the code and means day to day code changes are quick to build since the > first stack build is cached as a docker image layer. Still though, making a > small change to a stack yml file means a long wait re-downloading all > dependent libraries. > > Firstly, is there a better way of doing this? I know stack supports > building within docker but when I worked this way I still found projects > depended on the system stack and would commonly fail due to stack bugs. > > Secondly, my two "stack build" calls where the first builds a hello world > program feels a bit hacky. 
is there a better way to achieve this? > > Thanks, > Dan > > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tiredpixel at posteo.de Tue Sep 4 14:47:24 2018 From: tiredpixel at posteo.de (tiredpixel) Date: Tue, 04 Sep 2018 16:47:24 +0200 Subject: [Haskell-cafe] Building stack projects as docker images In-Reply-To: References: Message-ID: <236e03fab3eb16e2d53fcbcdf15c7cd0f61974bd.camel@posteo.de> Dear Dan, On Tue, 2018-09-04 at 12:21 +0100, Daniel Rolls via Haskell-Cafe wrote: > Secondly, my two "stack build" calls where the first builds a hello > world program feels a bit hacky. is there a better way to achieve > this? I cannot comment on the Stack parts of your questions, since I don't currently use it. Regarding the Docker parts, however, have you tried something like COPY [ \ "cabal.config", \ "*.cabal", \ "./"] RUN cabal update \ && \ cabal install -j --only-dependencies prior to your main compilation layer? That would install the dependencies once, with the layer cache expiring on change to the lib declarations, without the need for the 'hello world' program you describe. In case you don't freeze your dependencies, you can likely remove the `cabal.config` line. Peace, tiredpixel -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 488 bytes Desc: This is a digitally signed message part URL: From evan at evanrutledgeborden.dreamhosters.com Tue Sep 4 22:22:52 2018 From: evan at evanrutledgeborden.dreamhosters.com (evan@evan-borden.com) Date: Tue, 4 Sep 2018 18:22:52 -0400 Subject: [Haskell-cafe] ANN: network 2.8.0.0 Message-ID: Announcing the release of network 2.8.0.0 http://hackage.haskell.org/package/network-2.8.0.0 Version 2.8.0.0 * Breaking change: PortNumber originally contained Word32 in network byte order and used "deriving Ord". This results in strange behavior on the Ord instance. Now PortNumber holds Word32 in host byte order. [#347](https://github.com/haskell/network/pull/347) * Breaking change: stopping the export of the PortNum constructor in PortNumber. * Use bytestring == 0.10.* only. * Use base >= 4.7 && < 5. -- Evan Borden -------------- next part -------------- An HTML attachment was scrubbed... URL: From ietf-dane at dukhovni.org Wed Sep 5 00:52:31 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Tue, 4 Sep 2018 20:52:31 -0400 Subject: [Haskell-cafe] ANN: network 2.8.0.0 In-Reply-To: References: Message-ID: > On Sep 4, 2018, at 6:22 PM, evan at evan-borden.com wrote: > > * Breaking change: PortNumber originally contained Word32 in network > byte order and used "deriving Ord". This results in strange behavior > on the Ord instance. Now PortNumber holds Word32 in host byte order. > [#347](https://github.com/haskell/network/pull/347) > * Breaking change: stopping the export of the PortNum constructor in > PortNumber. Small correction, the port number of course was and remains Word16 not Word32. One might also note that to the extent that applications did use the PortNumber in SockAddrInet, and SockAddrInet6 structures, they likely relied on the Num, Enum and Integral instances, rather than using "PortNum" directly. 
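[Editorial note: as an illustration only (a sketch, not part of the announcement), code relying on those instances typically looks like this.]

```haskell
import Network.Socket (SockAddr (SockAddrInet), tupleToHostAddress)

-- The port is given as a plain numeric literal; PortNumber's Num
-- instance converts it, so the PortNum constructor is never needed.
localAddr :: SockAddr
localAddr = SockAddrInet 8080 (tupleToHostAddress (127, 0, 0, 1))
```

Such code keeps working unchanged across the 2.8.0.0 constructor removal.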
Those instances made it possible to use numeric literals or "fromIntegral" to create "PortNumber" from port numbers. And this has not changed. The only thing that's no longer possible is direct use of the "PortNum" constructor with values already in network byte order. This should be a rare impediment, and the removal of the constructor means that all problems are detected at compile-time. So this is unlikely to be an issue for the majority of applications, which likely just use getaddrinfo() in any case, and don't build SockAddrInet* structures "by hand". -- Viktor. From kazu at iij.ad.jp Wed Sep 5 02:40:50 2018 From: kazu at iij.ad.jp (Kazu Yamamoto (=?iso-2022-jp?B?GyRCOzNLXE9CSScbKEI=?=)) Date: Wed, 05 Sep 2018 11:40:50 +0900 (JST) Subject: [Haskell-cafe] ANN: network 2.8.0.0 In-Reply-To: References: Message-ID: <20180905.114050.1100205920779325764.kazu@iij.ad.jp> Hi Viktor, > Small correction, the port number of course was and remains Word16 > not Word32. Right. This is my fault. I have fixed this typo in github. > One might also note that to the extent that applications did use the > PortNumber in SockAddrInet, and SockAddrInet6 structures, they likely > relied on the Num, Enum and Integral instances, rather than using > "PortNum" directly. > > Those instances made it possible to use numeric literals or "fromIntegral" > to create "PortNumber" from port numbers. And this has not changed. The > only thing that's no longer possible is direct use of the "PortNum" constructor > with values already in network byte order. This should be a rare impediment, > and the removal of the constructor means that all problems are detected at > compile-time. > > So this is unlikely to be an issue for the majority of applications, which > likely just use getaddrinfo() in any case, and don't build SockAddrInet* > structures "by hand". Thank you for your explanation! This is exactly what we the maintainers intend. 
--Kazu From 78emil at gmail.com Wed Sep 5 08:33:15 2018 From: 78emil at gmail.com (Emil Axelsson) Date: Wed, 5 Sep 2018 10:33:15 +0200 Subject: [Haskell-cafe] Senior Haskell Developer opening at Mpowered Message-ID: <88da981e-8802-77a2-52db-f64fe64ea9c1@gmail.com> We are looking for a Haskeller to help us move our application forward using Haskell. We're a remote-only team based in South Africa and Sweden. Bonus if you've used some Nix or Ruby before! https://mpowered.co.za/jobs/ / Emil From johannes.waldmann at htwk-leipzig.de Wed Sep 5 10:55:34 2018 From: johannes.waldmann at htwk-leipzig.de (Johannes Waldmann) Date: Wed, 5 Sep 2018 12:55:34 +0200 Subject: [Haskell-cafe] how to Enumerate without lists? In-Reply-To: References: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> Message-ID: <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> Hi David, Thanks for responding. Let me re-phrase the technical question: in some hypothetical >   instance Foldable Enumerator where ... the methods (e.g., foldMap) would be overconstrained. Is there a way to still write something like it? It seems not, as shown by these examples: Data.EnumSet cannot implement Foldable because of Enum k. http://hackage.haskell.org/package/enummapset/docs/Data-EnumSet.html Data.IntSet cannot implement Foldable because of k ~ Int. - J. From mail at joachim-breitner.de Wed Sep 5 12:22:48 2018 From: mail at joachim-breitner.de (Joachim Breitner) Date: Wed, 05 Sep 2018 14:22:48 +0200 Subject: [Haskell-cafe] What is the best way to search information on Haskell Cafe? In-Reply-To: References: Message-ID: <4419419a31e04c40fecede7ddb7b10fa4b61349f.camel@joachim-breitner.de> Hi, Am Freitag, den 31.08.2018, 13:59 -0300 schrieb Rodrigo Stevaux: > I am new to mailing lists. Like really new. 
The Haskell Wiki lists a number of archives of the mailing list at https://wiki.haskell.org/Mailing_lists#Archiving The prettiest, most comprehensive search might be provided by https://haskell.markmail.org/ Joachim -- Joachim “nomeata” Breitner mail at joachim-breitner.de https://www.joachim-breitner.de/ -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: This is a digitally signed message part URL: From mgajda at mimuw.edu.pl Wed Sep 5 13:25:43 2018 From: mgajda at mimuw.edu.pl (Michal J Gajda) Date: Wed, 5 Sep 2018 15:25:43 +0200 Subject: [Haskell-cafe] Fwd: Setting RTS thread names useful? In-Reply-To: References: Message-ID: It seems `GHC.Conc.labelThread` does not have a corresponding `getThreadLabel`. It would be useful when redirecting all exceptions to `journald` etc. The problem might be that this function is documented to work in debugging builds only :-(. Just like stack traces, which need special options. -------------- next part -------------- An HTML attachment was scrubbed... URL: From zemyla at gmail.com Wed Sep 5 15:04:16 2018 From: zemyla at gmail.com (Zemyla) Date: Wed, 5 Sep 2018 10:04:16 -0500 Subject: [Haskell-cafe] how to Enumerate without lists? In-Reply-To: <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> References: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> Message-ID: You could always do a Coyoneda transform. data IntSetF a = IntSetF !IntSet (Int -> a) The Functor and Foldable instances are pretty obvious from it. Similarly with your Enumerator idea. On Wed, Sep 5, 2018, 05:56 Johannes Waldmann < johannes.waldmann at htwk-leipzig.de> wrote: > Hi David, > > Thanks for responding. > Let me re-phrase the technical question: in some hypothetical > > > instance Foldable Enumerator where ... > > the methods (e.g., foldMap) would be overconstrained. 
> Is there a way to still write something like it? > > It seems not, as shown by these examples: > > Data.EnumSet cannot implement Foldable because of Enum k. > http://hackage.haskell.org/package/enummapset/docs/Data-EnumSet.html > > Data.IntSet cannot implement Foldable because of k ~ Int. > > - J. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjamin.peter.doyle at gmail.com Wed Sep 5 15:37:07 2018 From: benjamin.peter.doyle at gmail.com (Ben Doyle) Date: Wed, 5 Sep 2018 11:37:07 -0400 Subject: [Haskell-cafe] how to Enumerate without lists? In-Reply-To: <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> References: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> Message-ID: Try this: {-#LANGUAGE GADTs#-} data Enumerator a b where Enumerator :: a -> a -> Enumerator a a instance Enum a => Foldable (Enumerator a) where foldMap f (Enumerator x y) | fromEnum x > fromEnum y = mempty | otherwise = f x <> foldMap f (Enumerator (succ x) y) Here we're using a GADT to express that our two-parameter Enumerator type in practice always has a == b (at the type level), which lets us constrain the values inside our new Foldable structure while still having a type of kind (* -> *) like the typeclass requires. On Wed, Sep 5, 2018 at 6:56 AM Johannes Waldmann < johannes.waldmann at htwk-leipzig.de> wrote: > Hi David, > > Thanks for responding. > Let me re-phrase the technical question: in some hypothetical > > > instance Foldable Enumerator where ... > > the methods (e.g., foldMap) would be overconstrained. > Is there a way to still write something like it? 
> > It seems not, as shown by these examples: > > Data.EnumSet cannot implement Foldable because of Enum k. > http://hackage.haskell.org/package/enummapset/docs/Data-EnumSet.html > > Data.IntSet cannot implement Foldable because of k ~ Int. > > - J. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From johnz at pleasantnightmare.com Wed Sep 5 16:35:19 2018 From: johnz at pleasantnightmare.com (John Z.) Date: Wed, 5 Sep 2018 12:35:19 -0400 Subject: [Haskell-cafe] Stack project, dependency with build flags Message-ID: <20180905163519.GA18119@johns-machine.localdomain> Hi everyone, I'll try with tl:dr; first: I have a project with a dependency that -on my system- needs a build flag, in order to correctly link against its native dependencies. What is a proper way to set up a project like this? A longer explanation: My project uses `ncurses` library and I'm running Arch. There seems to be some religious opinion going around about the layout of ncurses headers in /usr/include, and -naturally- Arch and `ncurses` disagree. Fortunately, author of `ncurses` included a build flag to allow (easy) build on systems which follow the 'wrong' layout. Thing is, I couldn't figure out how to add this configuration to the .cabal of my own project, so for now, I'm stuck with a Makefile that invokes `stack` commands with `--flag ncurses:force-narrow-library`. I looked in the manual and there doesn't seem to be a way to handle this aside the commandline argument. So this got me thinking - is there a different way projects like these are supposed to be organized? 
For example, am I supposed to check out `ncurses` as a submodule of my project, then replicate this build flag on my own project and 'pass it down' somehow? Any tips are highly appreciated; I'm really not proficient with organizing stack projects properly. Cheers! -- "That gum you like is going to come back in style." From michael at snoyman.com Wed Sep 5 16:46:26 2018 From: michael at snoyman.com (Michael Snoyman) Date: Wed, 5 Sep 2018 19:46:26 +0300 Subject: [Haskell-cafe] Stack project, dependency with build flags In-Reply-To: <20180905163519.GA18119@johns-machine.localdomain> References: <20180905163519.GA18119@johns-machine.localdomain> Message-ID: IIUC, you can set additional flags in your stack.yaml file: flags: ncurses: flagname: true # or false On Wed, Sep 5, 2018 at 7:35 PM John Z. wrote: > Hi everyone, > I'll try with tl:dr; first: > I have a project with a dependency that -on my system- needs a build > flag, in order to correctly link against its native dependencies. > What is a proper way to set up a project like this? > > A longer explanation: > My project uses `ncurses` library and I'm running Arch. There seems > to be some religious opinion going around about the layout of > ncurses headers in /usr/include, and -naturally- Arch and `ncurses` > disagree. Fortunately, author of `ncurses` included a build flag to > allow (easy) build on systems which follow the 'wrong' layout. > > Thing is, I couldn't figure out how to add this configuration to the > .cabal of my own project, so for now, I'm stuck with a Makefile that > invokes `stack` commands with `--flag ncurses:force-narrow-library`. > I looked in the manual and there doesn't seem to be a way to handle > this aside the commandline argument. > > So this got me thinking - is there a different way projects like > these are supposed to be organized? 
For example, am I supposed to > check out `ncurses` as a submodule of my project, then replicate > this build flag on my own project and 'pass it down' somehow? > > Any tips are highly appreciated; I'm really not proficient with > organizing stack projects properly. > > Cheers! > > > > -- > "That gum you like is going to come back in style." > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From trebla at vex.net Wed Sep 5 20:09:49 2018 From: trebla at vex.net (Albert Y. C. Lai) Date: Wed, 5 Sep 2018 16:09:49 -0400 Subject: [Haskell-cafe] small example for space-efficiency via lazy evaluation? In-Reply-To: <7af4c401-4716-e5e0-1ee3-3be95ea0ed3e@htwk-leipzig.de> References: <7af4c401-4716-e5e0-1ee3-3be95ea0ed3e@htwk-leipzig.de> Message-ID: <40f5acda-1449-aca1-7a7f-1f81ac7c2e68@vex.net> On 2018-09-04 12:36 PM, Johannes Waldmann wrote: > main = print $ sum $ map (^ 2) $ [ 1 :: Int .. 10^8 ] [...] > So, my question is, what do you use as a (teaching) example > for space-efficiency via lazy evaluation? I would skip the summing. Print the whole list. Sure it would take forever to finish, but during that time I would also fire up htop or something to show how much memory the process doesn't use as it progresses. And change Int to Integer and bump up the upper bound to 10^12 or something --- or even have no upper bound at all. And point out how the printing starts right away as opposed to "waiting for the whole list to be completely built before printing begins". And for the sake of engagement, before running the experiment, invite the students to make predictions about how much memory, how it grows, how much time before the printing begins, etc. 
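A minimal, runnable sketch of this experiment might look as follows (the `sumSquares` helper, the tiny bound of 10, and the `take 3` cap are illustrative choices so the program terminates; drop them to reproduce the open-ended run described above):

```haskell
import Data.List (foldl')

-- Constant-space sum: foldl' forces the accumulator at every step,
-- while the list of squares is produced lazily and never retained.
sumSquares :: Integer -> Integer
sumSquares n = foldl' (+) 0 (map (^ 2) [1 .. n])

main :: IO ()
main = do
  print (sumSquares 10)  -- 1 + 4 + ... + 100 = 385
  -- Printing also starts immediately: without the 'take' this
  -- streams squares forever while memory stays flat.
  mapM_ print (take 3 (map (^ 2) [1 :: Integer ..]))
```

Running it under `+RTS -s` (or watching htop) shows the residency staying constant however large the bound is made.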
Learning does not happen by nodding. Learning happens by dropping your jaw all the time. In my class I used these two other examples (because I didn't want to do I/O yet): doITerminate = take 2 (foo 0) where foo n = n : foo (n + 1) doIEvenMakeSense = take 2 foo where foo = 0 : foo They're merely "take 2" because next I also had to show the detailed steps of lazy evaluation. It would be boring to go "take 10". From tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk Thu Sep 6 08:18:09 2018 From: tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk (Tom Ellis) Date: Thu, 6 Sep 2018 09:18:09 +0100 Subject: [Haskell-cafe] how to Enumerate without lists? In-Reply-To: References: <9936a92f-08a6-1403-1153-a8db779d0cf8@htwk-leipzig.de> <7fac79d6-e17b-ff95-0114-49a3ecf2392c@htwk-leipzig.de> Message-ID: <20180906081809.4wn3yqlgo7jp6fah@weber> On Wed, Sep 05, 2018 at 11:37:07AM -0400, Ben Doyle wrote: > {-#LANGUAGE GADTs#-} > > data Enumerator a b where > Enumerator :: a -> a -> Enumerator a a > > instance Enum a => Foldable (Enumerator a) where > foldMap f (Enumerator x y) > | fromEnum x > fromEnum y = mempty > | otherwise = f x <> foldMap f (Enumerator > (succ x) y) > > Here we're using a GADT to express that our two-parameter Enumerator type > in practice always has a == b (at the type level). > Which lets us constrain the values inside our new Foldable structure, while > still having a type of kind (* -> *) like the the > typeclass requires. This is ingenious! From petr.mvd at gmail.com Thu Sep 6 09:42:49 2018 From: petr.mvd at gmail.com (Petr Pudlák) Date: Thu, 6 Sep 2018 11:42:49 +0200 Subject: [Haskell-cafe] modules as implicit data structures Message-ID: Hi Cafe, Let's say we have a module like module Foo where foo, bar :: String -> Something ... and use it elsewhere as import qualified Foo as F ... F.foo ...
This is almost as if we had a data type defined as data FooModule = FooModule { foo :: String -> Something, bar :: String -> Something } together with a singleton `f :: FooModule` and referenced it as f.foo, f.bar etc. Has any language explored this idea of making modules explicit as language-level objects? It seems that there could be some interesting possibilities, such as: - Abstract modules (just a definition of the data type). Then - Being able to replace a module with a different one (like a fake one for testing). - Polymorphic modules that could be instantiated for specific types. Thanks, Petr -------------- next part -------------- An HTML attachment was scrubbed... URL: From tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk Thu Sep 6 10:24:18 2018 From: tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk (Tom Ellis) Date: Thu, 6 Sep 2018 11:24:18 +0100 Subject: [Haskell-cafe] modules as implicit data structures In-Reply-To: References: Message-ID: <20180906102418.2vhnhr243geywbko@weber> On Thu, Sep 06, 2018 at 11:42:49AM +0200, Petr Pudlák wrote: > Let's say we have a module like > > module Foo where > > foo, bar :: String -> Something > ... > > and use it elsewhere as > > import qualified Foo as F > ... > F.foo ... > > This is almost just if we had a data type defined as > > data FooModule = FooModule { foo :: String -> Something, bar :: String -> > Something } Related reading http://www.haskellforall.com/2012/07/first-class-modules-without-defaults.html From marc at lamarciana.com Thu Sep 6 10:42:14 2018 From: marc at lamarciana.com (Marc Busqué) Date: Thu, 6 Sep 2018 12:42:14 +0200 (CEST) Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds Message-ID: In GHCi ``` :m +Data.Time parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe UTCTime -- => Just 2018-08-31 03:00:06 UTC parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe LocalTime -- => Just 1970-01-01 00:00:00 ``` Why?
¯\(°_o)/¯ Marc Busqué http://waiting-for-dev.github.io/about/ From leiva.steven at gmail.com Thu Sep 6 11:51:14 2018 From: leiva.steven at gmail.com (Steven Leiva) Date: Thu, 6 Sep 2018 06:51:14 -0500 Subject: [Haskell-cafe] modules as implicit data structures In-Reply-To: <20180906102418.2vhnhr243geywbko@weber> References: <20180906102418.2vhnhr243geywbko@weber> Message-ID: In Elixir/Erlang, module references are just atoms. ( https://elixir-lang.org/getting-started/basic-types.html#atoms) You could bind a name to one module and then rebind it to another module. On Thu, Sep 6, 2018 at 5:24 AM Tom Ellis < tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > On Thu, Sep 06, 2018 at 11:42:49AM +0200, Petr Pudlák wrote: > > Let's say we have a module like > > > > module Foo where > > > > foo, bar :: String -> Something > > ... > > > > and use it elsewhere as > > > > import qualified Foo as F > > ... > > F.foo ... > > > > This is almost just if we had a data type defined as > > > > data FooModule = FooModule { foo :: String -> Something, bar :: String -> > > Something } > > Related reading > > > http://www.haskellforall.com/2012/07/first-class-modules-without-defaults.html > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- Steven Leiva 305.528.6038 leiva.steven at gmail.com http://www.linkedin.com/in/stevenleiva -------------- next part -------------- An HTML attachment was scrubbed... URL: From johnz at pleasantnightmare.com Thu Sep 6 12:51:08 2018 From: johnz at pleasantnightmare.com (John Z.) 
Date: Thu, 6 Sep 2018 08:51:08 -0400 Subject: [Haskell-cafe] Stack project, dependency with build flags In-Reply-To: References: <20180905163519.GA18119@johns-machine.localdomain> Message-ID: <20180906125108.GA9465@johnslap> > IIUC, you can set additional flags in your stack.yaml file: > > flags: > ncurses: > flagname: true # or false > Thank you, that was exactly what I needed. I wasn't aware there's a `flags` section in stack.yaml. -- "That gum you like is going to come back in style." From svenpanne at gmail.com Thu Sep 6 13:00:54 2018 From: svenpanne at gmail.com (Sven Panne) Date: Thu, 6 Sep 2018 15:00:54 +0200 Subject: [Haskell-cafe] modules as implicit data structures In-Reply-To: References: Message-ID: Am Do., 6. Sep. 2018 um 11:43 Uhr schrieb Petr Pudlák : > [...] Has some language explored this idea of making modules explicit as > language-level objects? It seems that there could be some interesting > possibilities, such as: [...] > You might want to have a look at Standard ML's signatures, structures and functors, they are probably what you're thinking about. Cheers, S. -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael at snoyman.com Thu Sep 6 15:00:14 2018 From: michael at snoyman.com (Michael Snoyman) Date: Thu, 6 Sep 2018 18:00:14 +0300 Subject: [Haskell-cafe] Stack project, dependency with build flags In-Reply-To: <20180906125108.GA9465@johnslap> References: <20180905163519.GA18119@johns-machine.localdomain> <20180906125108.GA9465@johnslap> Message-ID: No problem. On Thu, Sep 6, 2018 at 3:51 PM John Z. wrote: > > IIUC, you can set additional flags in your stack.yaml file: > > > > flags: > > ncurses: > > flagname: true # or false > > > > Thank you, that was exactly what I needed. I wasn't aware there's a > `flags` section in stack.yaml. > > > -- > "That gum you like is going to come back in style." 
> _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From saurabhnanda at gmail.com Thu Sep 6 16:22:46 2018 From: saurabhnanda at gmail.com (Saurabh Nanda) Date: Thu, 6 Sep 2018 21:52:46 +0530 Subject: [Haskell-cafe] Stack project, dependency with build flags In-Reply-To: References: <20180905163519.GA18119@johns-machine.localdomain> <20180906125108.GA9465@johnslap> Message-ID: Additionally you can also pass these flags via the command line. You might have to keep a different set of flags in your git repo, and use a different set of flags for your local dev. I had a similar problem with hlibsass and mac os. On Thu 6 Sep, 2018, 8:30 PM Michael Snoyman, wrote: > No problem. > > On Thu, Sep 6, 2018 at 3:51 PM John Z. > wrote: > >> > IIUC, you can set additional flags in your stack.yaml file: >> > >> > flags: >> > ncurses: >> > flagname: true # or false >> > >> >> Thank you, that was exactly what I needed. I wasn't aware there's a >> `flags` section in stack.yaml. >> >> >> -- >> "That gum you like is going to come back in style." >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From johannes.waldmann at htwk-leipzig.de Fri Sep 7 15:24:54 2018 From: johannes.waldmann at htwk-leipzig.de (Johannes Waldmann) Date: Fri, 7 Sep 2018 17:24:54 +0200 Subject: [Haskell-cafe] common class for Set (and Map, resp.) implementations with different constraints on the keys Message-ID: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> Dear Cafe, we have Data.Set, Data.IntSet, Data.HashSet, and they all have a similar API, where the only difference is the constraint on the elements. (Same thing for maps.) Can we unify this as follows: {-# language ConstraintKinds, TypeFamilies #-} class SetC s where type Con s :: * -> Constraint singleton :: (Con s a) => a -> s a foldMap :: (Con s a, Monoid m) => (a -> m) -> s a -> m ... Then for Data.Set, we write instance SetC S.Set where type Con S.Set = Ord ; ... It seems to work, and it allows me to write polymorphic code, and switch implementations from the top. Full source: https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/tree/master/src/Data/Set Example use case (switch implementation): https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/blob/master/src/Matchbox/Tiling/Working.hs#L48 Still, there are some clumsy corners in this code, perhaps you can help: * for instance SetC HashSet, there are two constraints. I want to write type Con HashSet = \ e -> (Hashable e, Eq e) but this does not work (there is no "type lambda"?) * for maps, I want to write class (forall k . Foldable m k) => MapC m but this seems impossible now (This would work with -XQuantifiedConstraints?) * in some other code using the same idea (the class exports the constraint), I had an instance where the constraint was empty. Again, I cannot write type Con Foo = \ s -> () - J.W. From lysxia at gmail.com Fri Sep 7 15:33:38 2018 From: lysxia at gmail.com (Li-yao Xia) Date: Fri, 7 Sep 2018 11:33:38 -0400 Subject: [Haskell-cafe] common class for Set (and Map, resp.)
implementations with different constraints on the keys In-Reply-To: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> References: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> Message-ID: <1ebb46eb-c69a-cc2a-51ec-d871406d9b7d@gmail.com> You can define classes to serve as "constraint combinators", that can be partially applied: {- Unary Constraint conjunction -} class (c a, d a) => (&) (c :: k -> Constraint) (d :: k -> Constraint) (a :: k) instance (c a, d a) => (&) c d a {- Unary empty constraint -} class Empty a instance Empty a Now you can write type Con HashSet = Hashable & Eq type Con Foo = Empty Another alternative is to "eta-expand" the synonym Con: class SetC s where type Con s a :: Constraint class ... type Con HashSet a = (Hashable a, Eq a) One issue with that is that Con cannot be partially applied. Li-yao On 9/7/18 11:24 AM, Johannes Waldmann wrote: > Dear Cafe, > > > we have Data.Set, Data.IntSet, Data.HashSet, > and they all have similar API, where the only difference > is the constraint on the elements. (Same thing for maps.) > > Can we unify this as follows: > > {-# language ConstraintKinds, TypeFamilies #-} > class SetC s where > type Con s :: * -> Constraint > singleton :: (Con s a) => a -> s a > foldMap :: (Con s a, Monoid m) => (a -> m) -> s a -> m > ... > > Then for Data.Set, we write > > instance SetC S.Set where type Con S.Set = Ord ; ... > > It seems to work, and it allows me to write polymorphic code, > and switch implementations from the top. > Full source: > https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/tree/master/src/Data/Set > Example use case (switch implementation): > https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/blob/master/src/Matchbox/Tiling/Working.hs#L48 > > > > Still, there are some clumsy corners in this code, perhaps you can help: > > > * for instance SetC HashSet, there are two constraints. 
I want to write > > type Con HashSet = \ e -> (Hashable e, Eq, e) > > but this does not work (there is no "type lambda"?) > > > * for maps, I want to write > > class (forall k . Foldable m k) => MapC m > > but this seems impossible now (This is would work > with -XQuantifiedConstraints ?) > > > * in some other code using the same idea (the class exports the > constraint), I had an instance where the constraint was empty. > > Again, I cannot write type Con Foo = \ s -> () > > > - J.W. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > From david.feuer at gmail.com Fri Sep 7 15:51:01 2018 From: david.feuer at gmail.com (David Feuer) Date: Fri, 7 Sep 2018 11:51:01 -0400 Subject: [Haskell-cafe] common class for Set (and Map, resp.) implementations with different constraints on the keys In-Reply-To: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> References: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> Message-ID: In my opinion, such a class should usually have more than one parameter. In the case of Set, I think it makes more sense to use a value type than a constraint type. class e ~ Elem s => SetC e s where type Elem s :: Type type Elem (_ a) = a singleton :: e -> s elem :: e -> s -> Bool union :: s -> s -> s ... instance Ord a => SetC a (S.Set a) where singleton = S.singleton ... instance a ~ Int => SetC a IntSet where type Elem IntSet = Int ... For maps, you can do something similar: class k ~ Key m => MapC k m where type Key m :: Type type Key (_ k) = k lookup :: k -> m a -> Maybe a ... instance Ord k => MapC k (M.Map k) where lookup = M.lookup .... instance k ~ Int => MapC k IM.IntMap where type Key IntMap = Int lookup = IM.lookup If you like, you can add some constraints, like Traversable m. 
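Filled out with the extensions it needs, a minimal compiling version of this two-parameter idea might look like the following sketch (the `hasOne` helper and the reduced method set are illustrative additions, not part of the proposal above):

```haskell
{-# LANGUAGE FlexibleContexts        #-}
{-# LANGUAGE FlexibleInstances       #-}
{-# LANGUAGE MultiParamTypeClasses   #-}
{-# LANGUAGE TypeFamilies            #-}
{-# LANGUAGE UndecidableSuperClasses #-}

import qualified Data.IntSet as IS
import qualified Data.Set    as S

-- The superclass equality pins the element type to the set type.
class e ~ Elem s => SetC e s where
  type Elem s
  singleton :: e -> s
  member    :: e -> s -> Bool

instance Ord a => SetC a (S.Set a) where
  type Elem (S.Set a) = a
  singleton = S.singleton
  member    = S.member

instance e ~ Int => SetC e IS.IntSet where
  type Elem IS.IntSet = Int
  singleton = IS.singleton
  member    = IS.member

-- Polymorphic client code: works for any implementation of SetC.
hasOne :: SetC Int s => s -> Bool
hasOne = member 1

main :: IO ()
main = print ( hasOne (singleton 1 :: S.Set Int)
             , hasOne (singleton 2 :: IS.IntSet) )
```

The `e ~ Int` instance head (rather than `SetC Int IS.IntSet`) is what lets type inference flow from the set type to the element type at use sites.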
If you want to use MFoldable for sets, you can use its Element type family instead of Elem. On Fri, Sep 7, 2018, 11:25 AM Johannes Waldmann < johannes.waldmann at htwk-leipzig.de> wrote: > Dear Cafe, > > > we have Data.Set, Data.IntSet, Data.HashSet, > and they all have similar API, where the only difference > is the constraint on the elements. (Same thing for maps.) > > Can we unify this as follows: > > {-# language ConstraintKinds, TypeFamilies #-} > class SetC s where > type Con s :: * -> Constraint > singleton :: (Con s a) => a -> s a > foldMap :: (Con s a, Monoid m) => (a -> m) -> s a -> m > ... > > Then for Data.Set, we write > > instance SetC S.Set where type Con S.Set = Ord ; ... > > It seems to work, and it allows me to write polymorphic code, > and switch implementations from the top. > Full source: > > https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/tree/master/src/Data/Set > Example use case (switch implementation): > > https://gitlab.imn.htwk-leipzig.de/waldmann/pure-matchbox/blob/master/src/Matchbox/Tiling/Working.hs#L48 > > > > Still, there are some clumsy corners in this code, perhaps you can help: > > > * for instance SetC HashSet, there are two constraints. I want to write > > type Con HashSet = \ e -> (Hashable e, Eq, e) > > but this does not work (there is no "type lambda"?) > > > * for maps, I want to write > > class (forall k . Foldable m k) => MapC m > > but this seems impossible now (This is would work > with -XQuantifiedConstraints ?) > > > * in some other code using the same idea (the class exports the > constraint), I had an instance where the constraint was empty. > > Again, I cannot write type Con Foo = \ s -> () > > > - J.W. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From johannes.waldmann at htwk-leipzig.de Fri Sep 7 18:12:25 2018 From: johannes.waldmann at htwk-leipzig.de (waldmann) Date: Fri, 7 Sep 2018 20:12:25 +0200 Subject: [Haskell-cafe] common class for Set (and Map, resp.) implementations with different constraints on the keys In-Reply-To: References: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> Message-ID: On 09/07/2018 05:51 PM, David Feuer wrote: > class e ~ Elem s => SetC e s where OK. At the use site, under both proposals, there'll be a two argument constraint. In my version, the second argument was curried away. One way or the other - why don't we? What could be the downsides here? I guess since it's meant to sit atop (some) modules from various packages (containers, unordered-containers, enummapset) it's best to release it as a separate package, containing the classes, and orphan instances. - J. From yotam2206 at gmail.com Fri Sep 7 18:35:54 2018 From: yotam2206 at gmail.com (Yotam Ohad) Date: Fri, 7 Sep 2018 21:35:54 +0300 Subject: [Haskell-cafe] Passing creation flags to CreateProcess on windows Message-ID: Hello cafe, I'm working on a small debugger so I would like to use the winapi CreateProcess function with the appropriate flags for debugging. Unfortunately, it seems that in Haskell there is no way to pass the flags. I would like to not use LoadLibrary to get every function I need. Thanks Yotam -------------- next part -------------- An HTML attachment was scrubbed... URL: From allbery.b at gmail.com Fri Sep 7 18:42:17 2018 From: allbery.b at gmail.com (Brandon Allbery) Date: Fri, 7 Sep 2018 14:42:17 -0400 Subject: [Haskell-cafe] Passing creation flags to CreateProcess on windows In-Reply-To: References: Message-ID: System.Process only supports, and only can support, things that can reasonably be used or emulated on both Unix and Windows. 
You may want to look into https://downloads.haskell.org/~ghc/latest/docs/html/libraries/Win32-2.6.1.0/System-Win32-Process.html and friends. On Fri, Sep 7, 2018 at 2:36 PM Yotam Ohad wrote: > Hello cafe, > > I'm working on a small debugger so I would like to use the winapi > CreateProcess function with the appropriate flags for debugging. > Unfortunately, it seems that in Haskell > there > is no way to pass the flags. > I would like to not use LoadLibrary to get every function I need. > > Thanks > Yotam > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- brandon s allbery kf8nh allbery.b at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.feuer at gmail.com Fri Sep 7 18:54:25 2018 From: david.feuer at gmail.com (David Feuer) Date: Fri, 7 Sep 2018 14:54:25 -0400 Subject: [Haskell-cafe] common class for Set (and Map, resp.) implementations with different constraints on the keys In-Reply-To: References: <167b4a4c-775e-f89c-fe59-e6b6d3f37981@htwk-leipzig.de> Message-ID: The instances won't be orphans if they're in the same module as the class definition. On Fri, Sep 7, 2018, 2:12 PM waldmann wrote: > On 09/07/2018 05:51 PM, David Feuer wrote: > > > class e ~ Elem s => SetC e s where > > OK. At the use site, under both proposals, > there'll be a two argument constraint. > In my version, the second argument was curried away. > > One way or the other - why don't we? > What could be the downsides here? > > I guess since it's meant to sit atop (some) modules > from various packages (containers, unordered-containers, enummapset) > it's best to release it as a separate package, > containing the classes, and orphan instances. > > - J. > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rudy at matela.com.br Sat Sep 8 23:54:18 2018 From: rudy at matela.com.br (Rudy Matela) Date: Sat, 8 Sep 2018 20:54:18 -0300 Subject: [Haskell-cafe] ANN: leancheck-v0.7.4 with providers for Tasty, Hspec and test-framework Message-ID: <20180908235418.GA2205@zero.localdomain> Hello Haskell-Café, A new version of LeanCheck is out: v0.7.4. LeanCheck is a property testing library (like QuickCheck) that tests values enumeratively rather than at random (unlike QuickCheck). _What's new?_ I have created providers to make it easy to incorporate LeanCheck properties in Tasty, Hspec and test-framework test suites. This is optional and provided in separate packages: LeanCheck can still be used alone. LeanCheck and its framework providers are available on Hackage. You can install them with: $ cabal install leancheck $ cabal install tasty-leancheck $ cabal install hspec-leancheck $ cabal install test-framework-leancheck Check out the README files for examples of use: * https://github.com/rudymatela/leancheck * https://github.com/rudymatela/tasty-leancheck * https://github.com/rudymatela/hspec-leancheck * https://github.com/rudymatela/test-framework-leancheck Here's a sneak peek of LeanCheck+Tasty: ## Test program import Test.Tasty import Test.Tasty.LeanCheck as LC import Data.List main :: IO () main = defaultMain tests tests :: TestTree tests = testGroup "Test properties checked by LeanCheck" [ LC.testProperty "sort == sort . reverse" $ \list -> sort (list :: [Int]) == sort (reverse list) , LC.testProperty "Fermat's little theorem" $ \x -> ((x :: Integer)^7 - x) `mod` 7 == 0 -- the following property does not hold , LC.testProperty "Fermat's last theorem" $ \x y z n -> (n :: Integer) >= 3 LC.==> x^n + y^n /= (z^n :: Integer) ] ## Output for the test program $ ./test Test properties checked by LeanCheck sort == sort . reverse: OK +++ OK, passed 200 tests. Fermat's little theorem: OK +++ OK, passed 200 tests. Fermat's last theorem: FAIL *** Failed!
Falsifiable (after 71 tests): 0 0 0 3 1 out of 3 tests failed (0.00s) You can't see it here, but Tasty's output is highlighted in colours: "FAIL" appears in red so you can quickly see which properties failed. Best Regards, Rudy From rudy at matela.com.br Sun Sep 9 00:42:36 2018 From: rudy at matela.com.br (Rudy Matela) Date: Sat, 8 Sep 2018 21:42:36 -0300 Subject: [Haskell-cafe] ANN: leancheck-v0.7.2, enumerative QuickCheck-like testing In-Reply-To: Message-ID: <20180909004236.GB2205@zero.localdomain> Hi, On Fri, 31 Aug 2018 16:44:17 -0300, Rodrigo Stevaux wrote: > Hi, nice to see a fellow Brazilian. :-) > How does your work differ from QuickCheck? In summary, LeanCheck _enumerates_ test values whereas QuickCheck generates values _randomly_. For example consider: prop_double :: Int -> Bool prop_double x = x + x == 2 * x From the surface, QuickCheck and LeanCheck look very similar, in the following case even providing the same output: > import Test.QuickCheck > quickCheck . withMaxSuccess 10 $ prop_double +++ OK, passed 10 tests. > import Test.LeanCheck > checkFor 10 prop_double +++ OK, passed 10 tests. However, the 10 test values used by each tool differ quite a bit: > import Test.QuickCheck > sample' arbitrary :: IO [Int] [0,-2,-2,-4,-7,-4,-11,13,-3,-10,2] > sample' arbitrary :: IO [Int] [0,0,-2,4,-5,-1,-9,13,-12,-7,-12] > import Test.LeanCheck > take 10 list :: [Int] [0,1,-1,2,-2,3,-3,4,-4,5] > take 10 list :: [Int] [0,1,-1,2,-2,3,-3,4,-4,5] Since QuickCheck is random, each run tests different values, in this case, random integers. Since LeanCheck is enumerative, each run tests the same set of values, in this case integers of increasing magnitude. In both cases, if you increase the number of tests you'll be more likely to find a bug. Whether one wants enumerative or random test cases is a matter of debate. If in doubt, you can always use both... 
:-) The bindings for Tasty, test-framework and Hspec for LeanCheck and QuickCheck are very similar and can help in this regard. There are other differences in the library interface: I tried to keep LeanCheck's very simple. The default number of tests is different (LC: 200, QC: 100). But again, the main difference is the method of test data generation as explained above. Best Regards, Rudy PS: sorry for the delay in replying, I read haskell-café in a monthly digest and since I wasn't cc'ed, I only saw your e-mail now. From cdsmith at gmail.com Sun Sep 9 03:53:05 2018 From: cdsmith at gmail.com (Chris Smith) Date: Sat, 8 Sep 2018 23:53:05 -0400 Subject: [Haskell-cafe] Get-together at ICFP about Haskell in K-12 education Message-ID: Hello Haskell community! I've been floating an idea around about hosting a get-together in St. Louis for anyone interested in Haskell at the K-12 (pre-university) level. Some of you know that this has been a big passion of mine for the last 8 years or so. I'd love to discuss with others while a lot of us are in the same place -- whether you're interested in organized teaching, or just teaching your own kids, or just curious about the idea. I've put together this form to collect RSVPs and availability. Interest Form Thanks, Chris Smith -------------- next part -------------- An HTML attachment was scrubbed... URL: From Graham.Hutton at nottingham.ac.uk Mon Sep 10 12:51:14 2018 From: Graham.Hutton at nottingham.ac.uk (Graham Hutton) Date: Mon, 10 Sep 2018 12:51:14 +0000 Subject: [Haskell-cafe] Mathematics of Program Construction (MPC), Portugal, 2019 Message-ID: Dear all, The next Mathematics of Program Construction (MPC) conference will be held in the historic city of Porto, Portugal in October 2019, co-located with the Symposium on Formal Methods (FM). Please share, and submit your best papers! 
Best wishes, Graham Hutton Program Chair, MPC 2019 ====================================================================== 13th International Conference on Mathematics of Program Construction 7-9 October 2019, Porto, Portugal Co-located with Formal Methods 2019 https://tinyurl.com/MPC-Porto ====================================================================== BACKGROUND: The International Conference on Mathematics of Program Construction (MPC) aims to promote the development of mathematical principles and techniques that are demonstrably practical and effective in the process of constructing computer programs. MPC 2019 will be held in Porto, Portugal from 7-9 October 2019, and is co-located with the International Symposium on Formal Methods, FM 2019. Previous conferences were held in Königswinter, Germany (2015); Madrid, Spain (2012); Québec City, Canada (2010); Marseille, France (2008); Kuressaare, Estonia (2006); Stirling, UK (2004); Dagstuhl, Germany (2002); Ponte de Lima, Portugal (2000); Marstrand, Sweden (1998); Kloster Irsee, Germany (1995); Oxford, UK (1992); Twente, The Netherlands (1989). SCOPE: MPC seeks original papers on mathematical methods and tools put to use in program construction. Topics of interest range from algorithmics to support for program construction in programming languages and systems. Typical areas include type systems, program analysis and transformation, programming language semantics, security, and program logics. The notion of a 'program' is interpreted broadly, ranging from algorithms to hardware. Theoretical contributions are welcome, provided that their relevance to program construction is clear. Reports on applications are welcome, provided that their mathematical basis is evident. We also encourage the submission of 'programming pearls' that present elegant and instructive examples of the mathematics of program construction. 
IMPORTANT DATES: Abstract submission 26th April 2019 Paper submission 3rd May 2019 Author notification 14th June 2019 Camera ready copy 12th July 2019 Conference 7-9 October 2019 SUBMISSION: Submission is in two stages. Abstracts (plain text, maximum 250 words) must be submitted by 26th April 2019. Full papers (pdf, formatted using the llncs.sty style file for LaTeX) must be submitted by 3rd May 2019. There is no prescribed page limit, but authors should strive for brevity. Both abstracts and papers will be submitted using EasyChair. Papers must present previously unpublished work, and not be submitted concurrently to any other publication venue. Submissions will be evaluated by the program committee according to their relevance, correctness, significance, originality, and clarity. Each submission should explain its contributions in both general and technical terms, clearly identifying what has been accomplished, explaining why it is significant, and comparing it with previous work. Accepted papers must be presented in person at the conference by one of the authors. The proceedings of MPC 2019 will be published in the Lecture Notes in Computer Science (LNCS) series, as with all previous instances of the conference. Authors of accepted papers will be expected to transfer copyright to Springer for this purpose. After the conference, authors of the best papers from MPC 2019 and MPC 2015 will be invited to submit revised versions to a special issue of Science of Computer Programming (SCP). For any queries about submission please contact the program chair, Graham Hutton . PROGRAM COMMITTEE: Patrick Bahr IT University of Copenhagen, Denmark Richard Bird University of Oxford, UK Corina Cîrstea University of Southampton, UK Brijesh Dongol University of Surrey, UK João F.
Ferreira University of Lisbon, Portugal Jennifer Hackett University of Nottingham, UK William Harrison University of Missouri, USA Ralf Hinze University of Kaiserslautern, Germany Zhenjiang Hu National Institute of Informatics, Japan Graham Hutton (chair) University of Nottingham, UK Cezar Ionescu University of Oxford, UK Mauro Jaskelioff National University of Rosario, Argentina Ranjit Jhala University of California, San Diego, USA Ekaterina Komendantskaya Heriot-Watt University, UK Bernhard Möller University of Augsburg, Germany Shin-Cheng Mu Academia Sinica, Taiwan Mary Sheeran Chalmers University of Technology, Sweden Alexandra Silva University College London, UK Georg Struth University of Sheffield, UK VENUE: The conference will be held at the Alfândega Porto Congress Centre, a 150-year-old former customs house located in the historic centre of Porto on the bank of the river Douro. The venue was renovated by a Pritzker Prize-winning architect and has received many awards. LOCAL ORGANISERS José Nuno Oliveira University of Minho, Portugal For any queries about local issues please contact the local organiser, José Nuno Oliveira . ====================================================================== This message and any attachment are intended solely for the addressee and may contain confidential information. If you have received this message in error, please contact the sender and delete the email and attachment. Any views or opinions expressed by the author of this email do not necessarily reflect the views of the University of Nottingham. Email communications with the University of Nottingham may be monitored where permitted by law. From johannes.waldmann at htwk-leipzig.de Mon Sep 10 13:09:43 2018 From: johannes.waldmann at htwk-leipzig.de (waldmann) Date: Mon, 10 Sep 2018 15:09:43 +0200 Subject: [Haskell-cafe] ANN: leancheck-v0.7.4 with providers for Tasty, Hspec and test-framework Message-ID: <43d538ac-e8db-039c-58de-37af4281822a@htwk-leipzig.de> Hi.
I do prefer the enumerative approach for test case generation over the randomized one (quickcheck) because I have use cases (testing student code) where I want to avoid IO, and I just want a pure list of (small) counterexamples. Among the enumerative testing libraries, the "competition" is between leancheck and smallcheck. For me, the important differences are:

Enumeration of algebraic data types (ADTs), via built-in serial combinators, e.g., cons0 Leaf \/ cons2 Branch
* leancheck enumerates by size (number of nodes in tree)
* smallcheck enumerates by depth (longest path in tree)
I (much!) prefer "by size" because this delays combinatorial explosion somewhat.

Automated Listable/Serial instances for ADTs
* leancheck uses template-haskell
* smallcheck uses DeriveGeneric
Here I would actually prefer DeriveGeneric. But this certainly could be added to leancheck?

Oh, and leancheck does not use fancy typeclassery (smallcheck needs MultiParamTypeClasses and Monads just to declare a Serial instance) and it is also easier to get at the list of counterexamples. - J.W. From petr.mvd at gmail.com Mon Sep 10 16:16:21 2018 From: petr.mvd at gmail.com (=?UTF-8?B?UGV0ciBQdWRsw6Fr?=) Date: Mon, 10 Sep 2018 18:16:21 +0200 Subject: [Haskell-cafe] modules as implicit data structures In-Reply-To: References: Message-ID: Thank you everyone, these are some very interesting pointers to explore! Petr čt 6. 9. 2018 v 15:01 odesílatel Sven Panne napsal: > Am Do., 6. Sep. 2018 um 11:43 Uhr schrieb Petr Pudlák >: > >> [...] Has some language explored this idea of making modules explicit as >> language-level objects? It seems that there could be some interesting >> possibilities, such as: [...] >> > > You might want to have a look at Standard ML's signatures, structures and > functors, they are probably what you're thinking about. > > Cheers, > S. > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From cdsmith at gmail.com Mon Sep 10 16:46:43 2018 From: cdsmith at gmail.com (Chris Smith) Date: Mon, 10 Sep 2018 12:46:43 -0400 Subject: [Haskell-cafe] Get-together at ICFP about Haskell in K-12 education In-Reply-To: References: Message-ID: Looking over the responses so far, it's clear that this "Haskell in K-12" group should plan to meet for dinner on Monday at 7:00 pm. I am editing the interest form to specify the time. Please still add your name if you are interested in joining us but haven't responded yet. I'll send out a more complete invitation once I have a chance to work out a few more details. Thanks, Chris On Sat, Sep 8, 2018 at 11:53 PM Chris Smith wrote: > Hello Haskell community! > > I've been floating an idea around about hosting a get-together in St. > Louis for anyone interested in Haskell at the K-12 (pre-university) level. > Some of you know that this has been a big passion of mine for the last 8 > years or so. I'd love to discuss with others while a lot of us are in the > same place -- whether you're interested in organized teaching, or just > teaching your own kids, or just curious about the idea. > > I've put together this form to collect RSVPs and availability. > > Interest Form > > > > Thanks, > Chris Smith > -------------- next part -------------- An HTML attachment was scrubbed... URL: From judah.jacobson at gmail.com Mon Sep 10 19:12:31 2018 From: judah.jacobson at gmail.com (Judah Jacobson) Date: Mon, 10 Sep 2018 12:12:31 -0700 Subject: [Haskell-cafe] ANN: proto-lens-0.4.0.0 Message-ID: I'm pleased to announce the release of proto-lens-0.4.0.0. The library provides an API for protocol buffers, a language-independent binary file format. Cabal and Stack projects can use proto-lens to automatically generate Haskell source bindings from the original protocol buffer specifications. 
Some significant changes in this new release include:
- Simplifying the overloaded lens instances to improve readability and type error messages (along with the associated release of lens-labels-0.3.0.0)
- Switching to a custom class for default messages
- Hiding the internals of generated message types
- Making the Show instances more readable and concise
- Splitting the Cabal support into finer-grained packages, for better integration with Nix and Bazel

For general library documentation and tutorials: https://google.github.io/proto-lens/ For assistance in migrating from earlier versions: https://github.com/google/proto-lens/wiki/Migration-Guide For detailed changelogs: https://hackage.haskell.org/package/proto-lens/changelog https://hackage.haskell.org/package/proto-lens-protoc/changelog Best, -Judah -------------- next part -------------- An HTML attachment was scrubbed... URL: From roehst at gmail.com Tue Sep 11 00:54:20 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Mon, 10 Sep 2018 21:54:20 -0300 Subject: [Haskell-cafe] Difference between `type` and `newtype` in type checking Message-ID: Hi, I was studying this post ( http://www.haskellforall.com/2012/12/the-continuation-monad.html) on CPS and I tried the following code:

module Main where

newtype Cont r a = Cont { runCont :: (a -> r) -> r }

onInput :: Cont (IO ()) String
onInput f = do s <- getLine
               f s
               onInput f

main :: IO ()
main = onInput print

It fails to compile:

"Couldn't match expected type ‘Cont (IO ()) String’ with actual type ‘(String -> IO a0) -> IO b0’ • The equation(s) for ‘onInput’ have one argument, but its type ‘Cont (IO ()) String’ has none"

But I thought Cont a b would be expanded to (b -> a) -> a so that Cont (IO ()) String became (String -> IO ()) -> IO (), and if I give that type using `type` instead of `newtype`, it does type-check:

type Cont r a = (a -> r) -> r

What am I missing here about Haskell? thanks folks!
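For concreteness, here is a stripped-down pair that isolates the two behaviours (the `answerT`/`answerN` names are illustrative, not from the linked post):

```haskell
module Main where

-- A type synonym expands away: ContT r a *is* the function type,
-- so a ContT value can be defined with an ordinary equation.
type ContT r a = (a -> r) -> r

answerT :: ContT r Int
answerT k = k (6 * 7)              -- no wrapping needed

-- A newtype is a distinct type: values must be built with the
-- Cont constructor and taken apart with the runCont selector.
newtype Cont r a = Cont { runCont :: (a -> r) -> r }

answerN :: Cont r Int
answerN = Cont (\k -> k (6 * 7))   -- wrapping is mandatory

main :: IO ()
main = do
  print (answerT id)               -- 42
  print (runCont answerN id)       -- 42
```

With the synonym, GHC sees `answerT` as a plain function of one argument; with the newtype, the wrapper has to be applied and removed explicitly, which is exactly what the "its type ‘Cont (IO ()) String’ has none" error is pointing at.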
-------------- next part -------------- An HTML attachment was scrubbed... URL: From ryan.reich at gmail.com Tue Sep 11 01:09:26 2018 From: ryan.reich at gmail.com (Ryan Reich) Date: Mon, 10 Sep 2018 18:09:26 -0700 Subject: [Haskell-cafe] Difference between `type` and `newtype` in type checking In-Reply-To: References: Message-ID: Although a newtype has the same representation at runtime as the type inside, you still have to use the constructor as for a 'data' type. This makes sense, as it is a *new type*, whose values must be distinguished from those of the inner type, and the only way to do that is if they are decorated with a constructor. On Mon, Sep 10, 2018 at 5:55 PM Rodrigo Stevaux wrote: > Hi, I was studying this post ( > http://www.haskellforall.com/2012/12/the-continuation-monad.html) on CPS > and I tried the following code: > > module Main where > > newtype Cont r a = Cont { runCont :: (a -> r) -> r } > > onInput :: Cont (IO ()) String > onInput f = do s <- getLine > f s onInput f > > main :: IO () main = onInput print > > I fails to compile: > > "Couldn't match expected type ‘Cont (IO ()) String’ with actual type > ‘(String -> IO a0) -> IO b0’ • The equation(s) for ‘onInput’ have one > argument, but its type ‘Cont (IO ()) String’ has none" > > But I thought Cont a b would be expanded to (b -> a) -> a so that Cont (IO > ()) String became (String -> IO ()) -> IO (), and if I give that type using > `type` instead of `newtype`, it does type-check: > > type Cont r a = (a -> r) -> r > > What am I missing here about Haskell? > > thanks folks! > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yotam2206 at gmail.com Tue Sep 11 06:50:42 2018 From: yotam2206 at gmail.com (Yotam Ohad) Date: Tue, 11 Sep 2018 09:50:42 +0300 Subject: [Haskell-cafe] Adding a custom lib to stack project Message-ID: Hi, I made a lib from a cpp project with one function: BOOL Foo(LPCSTR bar) In the stack project I added the .lib file's folder to the extra-lib-dirs/extra-include-dirs and then, in main:

{-# LANGUAGE ForeignFunctionInterface #-}

module Main where

import System.Win32.Types
import Foreign.C.String

main :: IO ()
main = do
    withCString "bar" c_Foo
    putStrLn "success"

foreign import ccall "bindings.lib Foo"
    c_Foo :: LPCSTR -> IO BOOL

When building I get the following error:

Building all executables for `tape' once. After a successful build of all of them, only specified executables will be rebuilt.
tape-0.1.0.0: build (lib + exe)
Preprocessing library for tape-0.1.0.0..
Building library for tape-0.1.0.0..
ignoring (possibly broken) abi-depends field for packages
Preprocessing executable 'tape-exe' for tape-0.1.0.0..
Building executable 'tape-exe' for tape-0.1.0.0..
Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ...
.stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): undefined reference to `CreateDebuggedProcess'
collect2.exe: error: ld returned 1 exit status
`gcc.exe' failed in phase `Linker'. (Exit code: 1)

-- While building custom Setup.hs for package tape-0.1.0.0 using:
C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always"
Process exited with code: ExitFailure

What am I doing wrong? Yotam -------------- next part -------------- An HTML attachment was scrubbed...
URL: From lonetiger at gmail.com Tue Sep 11 07:03:03 2018 From: lonetiger at gmail.com (Phyx) Date: Tue, 11 Sep 2018 08:03:03 +0100 Subject: [Haskell-cafe] Adding a custom lib to stack project In-Reply-To: References: Message-ID: Hi, I assume CreateDebuggedProcess is defined in bindings.lib? You need to also add extra-libraries: bindings Also keep in mind that C++ has a different name mangling than C, so if your function is in a class you'll need to use the proper name for it. nm -g bindings.lib would show the actual name. Tamar. On Tue, Sep 11, 2018, 07:51 Yotam Ohad wrote: > Hi, > > I made a lib from a cpp project with one function: BOOL Foo(LPCSTR bar) > In the stack project I added the .lib file's folder to the > extra-lib-dirs/extra-include-dirs and then, in main: > > {-# LANGUAGE ForeignFunctionInterface #-} > > module Main where > > import System.Win32.Types > import Foreign.C.String > > main :: IO () > main = do > withCString "bar" c_Foo > putStrLn "success" > > foreign import ccall "bindings.lib Foo" > c_Foo :: LPCSTR -> IO BOOL > > When building I get the following error > Building all executables for `tape' once. After a successful build of all > of them, only specified executables will be rebuilt. > tape-0.1.0.0: build (lib + exe) > Preprocessing library for tape-0.1.0.0.. > Building library for tape-0.1.0.0.. > ignoring (possibly broken) abi-depends field for packages > Preprocessing executable 'tape-exe' for tape-0.1.0.0.. > Building executable 'tape-exe' for tape-0.1.0.0.. > Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ... > .stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): > undefined reference to `CreateDebuggedProcess' > collect2.exe: error: ld returned 1 exit status > `gcc.exe' failed in phase `Linker'. 
(Exit code: 1) > > -- While building custom Setup.hs for package tape-0.1.0.0 using: > > C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe > --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe > --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always" > Process exited with code: ExitFailure > > What am I doing wrong? > Yotam > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From konn.jinro at gmail.com Tue Sep 11 07:05:39 2018 From: konn.jinro at gmail.com (Hiromi ISHII) Date: Tue, 11 Sep 2018 16:05:39 +0900 Subject: [Haskell-cafe] modules as implicit data structures In-Reply-To: References: Message-ID: <083EFA83-3735-4348-B0F9-F72884B3B20E@gmail.com> Hi Petr, I think Agda's module system is something similar to your conception. In Agda, you can treat modules and records interchangeably. > 2018/09/11 1:16、Petr Pudlák のメール: > > Thank you everyone, these are some very interesting pointers to explore! > > Petr > > čt 6. 9. 2018 v 15:01 odesílatel Sven Panne napsal: > Am Do., 6. Sep. 2018 um 11:43 Uhr schrieb Petr Pudlák : > [...] Has some language explored this idea of making modules explicit as language-level objects? It seems that there could be some interesting possibilities, such as: [...] > > You might want to have a look at Standard ML's signatures, structures and functors, they are probably what you're thinking about. > > Cheers, > S. > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post.
----- 石井 大海 --------------------------- konn.jinro at gmail.com 筑波大学数理物質科学研究科 数学専攻 博士後期課程三年 ---------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 488 bytes Desc: Message signed with OpenPGP URL: From yotam2206 at gmail.com Tue Sep 11 07:31:59 2018 From: yotam2206 at gmail.com (Yotam Ohad) Date: Tue, 11 Sep 2018 10:31:59 +0300 Subject: [Haskell-cafe] Adding a custom lib to stack project In-Reply-To: References: Message-ID: Thanks for the reply. I've added extra-libraries to package.yaml. Now though, I'm getting a missing C library error (although `stack path --extra-library-dirs` prints the directory of the .lib file) Yotam ‫בתאריך יום ג׳, 11 בספט׳ 2018 ב-10:03 מאת ‪Phyx‬‏ <‪lonetiger at gmail.com‬‏>:‬ > Hi, > > I assume CreateDebuggedProcess is defined in bindings.lib? You need to > also add extra-libraries: bindings > > Also keep in mind that C++ has a different name mangling than C, so if > your function is in a class you'll need to use the proper name for it. > > nm -g bindings.lib would show the actual name. > > Tamar. > > On Tue, Sep 11, 2018, 07:51 Yotam Ohad wrote: > >> Hi, >> >> I made a lib from a cpp project with one function: BOOL Foo(LPCSTR bar) >> In the stack project I added the .lib file's folder to the >> extra-lib-dirs/extra-include-dirs and then, in main: >> >> {-# LANGUAGE ForeignFunctionInterface #-} >> >> module Main where >> >> import System.Win32.Types >> import Foreign.C.String >> >> main :: IO () >> main = do >> withCString "bar" c_Foo >> putStrLn "success" >> >> foreign import ccall "bindings.lib Foo" >> c_Foo :: LPCSTR -> IO BOOL >> >> When building I get the following error >> Building all executables for `tape' once. After a successful build of all >> of them, only specified executables will be rebuilt. >> tape-0.1.0.0: build (lib + exe) >> Preprocessing library for tape-0.1.0.0.. 
>> Building library for tape-0.1.0.0.. >> ignoring (possibly broken) abi-depends field for packages >> Preprocessing executable 'tape-exe' for tape-0.1.0.0.. >> Building executable 'tape-exe' for tape-0.1.0.0.. >> Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ... >> .stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): >> undefined reference to `CreateDebuggedProcess' >> collect2.exe: error: ld returned 1 exit status >> `gcc.exe' failed in phase `Linker'. (Exit code: 1) >> >> -- While building custom Setup.hs for package tape-0.1.0.0 using: >> >> C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe >> --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe >> --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always" >> Process exited with code: ExitFailure >> >> What am I doing wrong? >> Yotam >> >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lonetiger at gmail.com Tue Sep 11 07:50:36 2018 From: lonetiger at gmail.com (Phyx) Date: Tue, 11 Sep 2018 08:50:36 +0100 Subject: [Haskell-cafe] Adding a custom lib to stack project In-Reply-To: References: Message-ID: I don't use stack so can't help you much there. If you can get it to put ghc in verbose mode you can see what it's passing to the compiler. If the error is coming from stack itself then you'll need to figure out how stack tests for the library. On Tue, Sep 11, 2018, 08:32 Yotam Ohad wrote: > Thanks for the reply. > I've added extra-libraries to package.yaml. 
> Now though, I'm getting a missing C library error (although `stack path > --extra-library-dirs` prints the directory of the .lib file) > > Yotam > > ‫בתאריך יום ג׳, 11 בספט׳ 2018 ב-10:03 מאת ‪Phyx‬‏ <‪lonetiger at gmail.com > ‬‏>:‬ > >> Hi, >> >> I assume CreateDebuggedProcess is defined in bindings.lib? You need to >> also add extra-libraries: bindings >> >> Also keep in mind that C++ has a different name mangling than C, so if >> your function is in a class you'll need to use the proper name for it. >> >> nm -g bindings.lib would show the actual name. >> >> Tamar. >> >> On Tue, Sep 11, 2018, 07:51 Yotam Ohad wrote: >> >>> Hi, >>> >>> I made a lib from a cpp project with one function: BOOL Foo(LPCSTR bar) >>> In the stack project I added the .lib file's folder to the >>> extra-lib-dirs/extra-include-dirs and then, in main: >>> >>> {-# LANGUAGE ForeignFunctionInterface #-} >>> >>> module Main where >>> >>> import System.Win32.Types >>> import Foreign.C.String >>> >>> main :: IO () >>> main = do >>> withCString "bar" c_Foo >>> putStrLn "success" >>> >>> foreign import ccall "bindings.lib Foo" >>> c_Foo :: LPCSTR -> IO BOOL >>> >>> When building I get the following error >>> Building all executables for `tape' once. After a successful build of >>> all of them, only specified executables will be rebuilt. >>> tape-0.1.0.0: build (lib + exe) >>> Preprocessing library for tape-0.1.0.0.. >>> Building library for tape-0.1.0.0.. >>> ignoring (possibly broken) abi-depends field for packages >>> Preprocessing executable 'tape-exe' for tape-0.1.0.0.. >>> Building executable 'tape-exe' for tape-0.1.0.0.. >>> Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ... >>> .stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): >>> undefined reference to `CreateDebuggedProcess' >>> collect2.exe: error: ld returned 1 exit status >>> `gcc.exe' failed in phase `Linker'. 
(Exit code: 1) >>> >>> -- While building custom Setup.hs for package tape-0.1.0.0 using: >>> >>> C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe >>> --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe >>> --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always" >>> Process exited with code: ExitFailure >>> >>> What am I doing wrong? >>> Yotam >>> >>> _______________________________________________ >>> Haskell-Cafe mailing list >>> To (un)subscribe, modify options or view archives go to: >>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >>> Only members subscribed via the mailman list are allowed to post. >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From brucker at spamfence.net Tue Sep 11 09:15:32 2018 From: brucker at spamfence.net (Achim D. Brucker) Date: Tue, 11 Sep 2018 10:15:32 +0100 Subject: [Haskell-cafe] Call for Papers: Postproceedings for ThEdu'18 by EPTCS Message-ID: <20180911091532.vmufzl54uf4kltvn@kandagawa.home.brucker.ch> (Apologies for duplicates) Open Call for Papers ************************************************************************** Postproceedings for ThEdu'18 by EPTCS Theorem proving components for Educational software http://www.uc.pt/en/congressos/thedu/thedu18 ************************************************************************** Workshop ThEdu at FLoC Federated Logic Conference 2018 http://www.floc2018.org/ ************************************************************************** THedu'18 Postproceedings: ThEdu's programme comprised seven contributions, also presented on the workshop web pages. Postproceedings are now planned to collect the contributions upgraded to full papers. The contributions' topics are diverse according to ThEdu's scope, and this call is open to everyone, including those who did not participate in the workshop. All papers will undergo review according to EPTCS standards.
THedu'18 Scope: Computer Theorem Proving is becoming a paradigm as well as a technological base for a new generation of educational software in science, technology, engineering and mathematics. The workshop brings together experts in automated deduction with experts in education in order to further clarify the shape of the new software generation and to discuss existing systems. Topics of interest include:
* methods of automated deduction applied to checking students' input;
* methods of automated deduction applied to proving post-conditions for particular problem solutions;
* combinations of deduction and computation enabling systems to propose next steps;
* automated provers specific for dynamic geometry systems;
* proof and proving in mathematics education.
Important Dates
* 2nd call for papers: 10 Sep 2018
* Submission (full papers): 18 Nov 2018
* Notification of acceptance: 17 Dec 2018
* Revised papers due: 21 Jan 2019
Submission We welcome submission of papers presenting original unpublished work which has not been submitted for publication elsewhere. The authors should comply with the "instructions for authors", LaTeX style files and accept the "Non-exclusive license to distribute" of EPTCS: Instructions for authors (http://info.eptcs.org/) LaTeX style file and formatting instructions (http://style.eptcs.org/) Copyright (http://copyright.eptcs.org/) Papers should be submitted via easychair, https://easychair.org/conferences/?conf=thedu18. In case the contributions ultimately do not reach the number required by EPTCS, there will be the alternative of publishing them as a technical report at CISUC https://www.cisuc.uc.pt/publications.
Program Committee Francisco Botana, University of Vigo at Pontevedra, Spain Roman Hašek, University of South Bohemia, Czech Republic Filip Maric, University of Belgrade, Serbia Walther Neuper, Graz University of Technology, Austria (co-chair) Pavel Pech, University of South Bohemia, Czech Republic Pedro Quaresma, University of Coimbra, Portugal (co-chair) Vanda Santos, CISUC, Portugal Wolfgang Schreiner, Johannes Kepler University, Austria -- Dr. Achim D. Brucker | Software Assurance & Security | University of Sheffield https://www.brucker.ch | https://logicalhacking.com/blog @adbrucker | @logicalhacking From lonetiger at gmail.com Tue Sep 11 17:43:15 2018 From: lonetiger at gmail.com (lonetiger at gmail.com) Date: Tue, 11 Sep 2018 18:43:15 +0100 Subject: [Haskell-cafe] Adding a custom lib to stack project In-Reply-To: References: Message-ID: <5b97feb2.1c69fb81.43d1e.9353@mx.google.com> Hi Yotam, Have you tried with cabal? If that gives the same error can you paste your cabal file somewhere and I can take a look for you. Cheers, Tamar From: Phyx Sent: Tuesday, September 11, 2018 08:50 To: Yotam Ohad Cc: haskell-cafe at haskell.org Subject: Re: [Haskell-cafe] Adding a custom lib to stack project I don't use stack so can't help you much there. If you can get it to put ghc in verbose mode you can see what it's passing to the compiler. If the error is coming from stack itself then you'll need to figure out how stack tests for the library.  On Tue, Sep 11, 2018, 08:32 Yotam Ohad wrote: Thanks for the reply. I've added extra-libraries to package.yaml. Now though, I'm getting a missing C library error (although `stack path --extra-library-dirs` prints the directory of the .lib file) Yotam ‫בתאריך יום ג׳, 11 בספט׳ 2018 ב-10:03 מאת ‪Phyx‬‏ <‪lonetiger at gmail.com‬‏>:‬ Hi,  I assume CreateDebuggedProcess is defined in bindings.lib? 
You need to also add extra-libraries: bindings  Also keep in mind that C++ has a different name mangling than C, so if your function is in a class you'll need to use the proper name for it.  nm -g bindings.lib would show the actual name.  Tamar.  On Tue, Sep 11, 2018, 07:51 Yotam Ohad wrote: Hi, I made a lib from a  cpp project with one function: BOOL Foo(LPCSTR bar) In the stack project I added the .lib file's folder to the extra-lib-dirs/extra-include-dirs and then, in main: {-# LANGUAGE ForeignFunctionInterface #-} module Main where import System.Win32.Types import Foreign.C.String main :: IO () main = do     withCString "bar" c_Foo     putStrLn "success" foreign import ccall "bindings.lib Foo"     c_Foo :: LPCSTR -> IO BOOL When building I get the following error Building all executables for `tape' once. After a successful build of all of them, only specified executables will be rebuilt. tape-0.1.0.0: build (lib + exe) Preprocessing library for tape-0.1.0.0.. Building library for tape-0.1.0.0.. ignoring (possibly broken) abi-depends field for packages Preprocessing executable 'tape-exe' for tape-0.1.0.0.. Building executable 'tape-exe' for tape-0.1.0.0.. Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ... .stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): undefined reference to `CreateDebuggedProcess' collect2.exe: error: ld returned 1 exit status `gcc.exe' failed in phase `Linker'. (Exit code: 1) --  While building custom Setup.hs for package tape-0.1.0.0 using:       C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always"     Process exited with code: ExitFailure What am I doing wrong? 
Yotam _______________________________________________ Haskell-Cafe mailing list To (un)subscribe, modify options or view archives go to: http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yotam2206 at gmail.com Tue Sep 11 17:48:39 2018 From: yotam2206 at gmail.com (Yotam Ohad) Date: Tue, 11 Sep 2018 20:48:39 +0300 Subject: [Haskell-cafe] Adding a custom lib to stack project In-Reply-To: <5b97feb2.1c69fb81.43d1e.9353@mx.google.com> References: <5b97feb2.1c69fb81.43d1e.9353@mx.google.com> Message-ID: Hi Tamar, I managed to build at the end by adding the c source files with `c-sources:` in package.yaml Thanks for your help Yotam ‫בתאריך יום ג׳, 11 בספט׳ 2018 ב-20:43 מאת <‪lonetiger at gmail.com‬‏>:‬ > > > Hi Yotam, > > > > Have you tried with cabal? If that gives the same error can you paste your > cabal file somewhere and I can take a look for you. > > > > Cheers, > > Tamar > > > > *From: *Phyx > *Sent: *Tuesday, September 11, 2018 08:50 > *To: *Yotam Ohad > *Cc: *haskell-cafe at haskell.org > *Subject: *Re: [Haskell-cafe] Adding a custom lib to stack project > > > > I don't use stack so can't help you much there. If you can get it to put > ghc in verbose mode you can see what it's passing to the compiler. If the > error is coming from stack itself then you'll need to figure out how stack > tests for the library. > > On Tue, Sep 11, 2018, 08:32 Yotam Ohad wrote: > > Thanks for the reply. > > I've added extra-libraries to package.yaml. > > Now though, I'm getting a missing C library error (although `stack path > --extra-library-dirs` prints the directory of the .lib file) > > > > Yotam > > > > ‫בתאריך יום ג׳, 11 בספט׳ 2018 ב-10:03 מאת ‪Phyx‏ <‪lonetiger at gmail.com‏>: > > Hi, > > > > I assume CreateDebuggedProcess is defined in bindings.lib? 
You need to > also add extra-libraries: bindings > > > > Also keep in mind that C++ has a different name mangling than C, so if > your function is in a class you'll need to use the proper name for it. > > > > nm -g bindings.lib would show the actual name. > > > > Tamar. > > > > On Tue, Sep 11, 2018, 07:51 Yotam Ohad wrote: > > Hi, > > I made a lib from a cpp project with one function: BOOL Foo(LPCSTR bar) > In the stack project I added the .lib file's folder to the > extra-lib-dirs/extra-include-dirs and then, in main: > > {-# LANGUAGE ForeignFunctionInterface #-} > > module Main where > > import System.Win32.Types > import Foreign.C.String > > main :: IO () > main = do > withCString "bar" c_Foo > putStrLn "success" > > foreign import ccall "bindings.lib Foo" > c_Foo :: LPCSTR -> IO BOOL > > When building I get the following error > Building all executables for `tape' once. After a successful build of all > of them, only specified executables will be rebuilt. > tape-0.1.0.0: build (lib + exe) > Preprocessing library for tape-0.1.0.0.. > Building library for tape-0.1.0.0.. > ignoring (possibly broken) abi-depends field for packages > Preprocessing executable 'tape-exe' for tape-0.1.0.0.. > Building executable 'tape-exe' for tape-0.1.0.0.. > Linking .stack-work\dist\7d103d30\build\tape-exe\tape-exe.exe ... > .stack-work\dist\7d103d30\build\tape-exe\tape-exe-tmp\Main.o:fake:(.text+0x102): > undefined reference to `CreateDebuggedProcess' > collect2.exe: error: ld returned 1 exit status > `gcc.exe' failed in phase `Linker'. (Exit code: 1) > > -- While building custom Setup.hs for package tape-0.1.0.0 using: > > C:\sr\setup-exe-cache\x86_64-windows\Cabal-simple_Z6RU0evB_2.2.0.1_ghc-8.4.3.exe > --builddir=.stack-work\dist\7d103d30 build lib:tape exe:tape-exe > --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always" > Process exited with code: ExitFailure > > > > What am I doing wrong? 
> > Yotam > > > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From roehst at gmail.com Wed Sep 12 01:50:41 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Tue, 11 Sep 2018 22:50:41 -0300 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? Message-ID: It is easy to read an environment in applicative style, i.e.: type Env = [(String, Int)] data Term = Add Term Term | Number Int | Var String deriving Show eval :: Term -> Env -> Int eval (Add a b) = (+) <$> eval a <*> eval b eval (Var name) = fetch name eval (Number i) = pure i fetch :: String -> Env -> Int fetch name = fromJust . lookup name But can the eval function change the Env being passed, so as to implement a "let" operation, without using monads? I tried a lot but ultimately I resorted to (>>=) in the function monad: bind f k = \r -> k (f r) r I do not think so, because in applicative style each operand can have an effect (reading the environment) but cannot affect other operands (including the next ones), i.e., there is no notion of sequencing in applicatives. Is this reasoning right? From tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk Wed Sep 12 06:25:13 2018 From: tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk (Tom Ellis) Date: Wed, 12 Sep 2018 07:25:13 +0100 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style?
In-Reply-To: References: Message-ID: <20180912062513.6jzulp6u7iu54fpr@weber> On Tue, Sep 11, 2018 at 10:50:41PM -0300, Rodrigo Stevaux wrote: > It is easy to read an environment in applicative style, ie: > > type Env = [(String, Int)] > data Term = Add Term Term | Number Int | Var String deriving Show > eval :: Term -> Env -> Int > eval (Add a b) = (+) <$> eval a <*> eval b > eval (Var name) = fetch name > eval (Number i) = pure i > > fetch :: String -> Env -> Int > fetch name = fromJust . lookup name > > But can the eval function change the Env being passed, as to implement > a "let" operation, without using monads? I tried I lot but ultimately > I resorted to (>>=) in the function monad: > > bind f k = \r -> k (f r) r I'm confused by this. Your `bind` doesn't seem to change the Env being passed. Can you explain? > I do not think so, because in applicative style each operand can have > an effect (reading the environment) but can not affect other operands > (including the next ones), i.e., there is no notion of sequencing in > applicatives > > Is this reasoning right? Your conclusion may be right but I don't think that your reasoning is. Certainly an applicative action can affect the subsequent operations. State is Applicative after all! Tom From ivanperezdominguez at gmail.com Wed Sep 12 08:09:51 2018 From: ivanperezdominguez at gmail.com (Ivan Perez) Date: Wed, 12 Sep 2018 04:09:51 -0400 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? In-Reply-To: References: Message-ID: On 11 September 2018 at 21:50, Rodrigo Stevaux wrote: > It is easy to read an environment in applicative style, ie: > > type Env = [(String, Int)] > data Term = Add Term Term | Number Int | Var String deriving Show > eval :: Term -> Env -> Int > eval (Add a b) = (+) <$> eval a <*> eval b > eval (Var name) = fetch name > eval (Number i) = pure i > > fetch :: String -> Env -> Int > fetch name = fromJust . 
lookup name > > But can the eval function change the Env being passed, as to implement > a "let" operation, without using monads? I tried I lot but ultimately > I resorted to (>>=) in the function monad: > > bind f k = \r -> k (f r) r > I think what you mean is something like: can we extend Term with a let binding expression and implement eval using applicative interface without (>>=)? I think we can, and it's a bit awkward, but possible, because of the Reader monad. A trivial way of introducing let that does not manifest the issues you point out is data Term = Add Term Term | Number Int | Var String | Let String Int Term You can then implement the case for eval with eval (Let s v t) = eval t . update s v where the function update simply updates a value in the associative list. A simple implementation is: update :: Ord a => a -> b -> [(a, b)] -> [(a, b)] update s v = nubBy eqFst . insertBy cmpFst (s, v) where eqFst x y = (==) (fst x) (fst y) cmpFst x y = compare (fst x) (fst y) Of course, this does not need the monad interface, but it does not really need the applicative interface to evaluate the term either (except indirectly in eval t). Perhaps a more interesting alternative is: data Term = ... | LetT String Term Term where the other cases in Term remain the same. Now you need to eval the first term to change the environment, which is, I guess, what you wanted? You can do this combining composition with applicative: eval (LetT s t1 t2) = eval t2 . (update' <*> pure s <*> eval t1) where update' :: Env -> String -> Int -> Env update' e s v = update s v e And a test (which is equivalent to let b = a + 8 in b + 1): *Main> eval (LetT "b" (Add (Number 8) (Var "a")) (Add (Number 1) (Var "b"))) [("a", 7)] 16 > > I do not think so, because in applicative style each operand can have > an effect (reading the environment) but can not affect other operands > (including the next ones), i.e., there is no notion of sequencing in > applicatives > Is this reasoning right? 
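Ivan's fragments above assemble into one self-contained, runnable module; the pieces are as given in the thread, with only imports, a `fetch` shared from Rodrigo's original message, and a `main` added:

```haskell
-- Ivan's LetT evaluator, assembled into one runnable module.
import Data.List (insertBy, nubBy)
import Data.Maybe (fromJust)

type Env = [(String, Int)]

data Term = Add Term Term | Number Int | Var String
          | LetT String Term Term
  deriving Show

-- Update (or insert) a binding in the associative list.
update :: Ord a => a -> b -> [(a, b)] -> [(a, b)]
update s v = nubBy eqFst . insertBy cmpFst (s, v)
  where
    eqFst  x y = fst x == fst y
    cmpFst x y = compare (fst x) (fst y)

fetch :: String -> Env -> Int
fetch name = fromJust . lookup name

eval :: Term -> Env -> Int
eval (Add a b)      = (+) <$> eval a <*> eval b
eval (Var name)     = fetch name
eval (Number i)     = pure i
eval (LetT s t1 t2) = eval t2 . (update' <*> pure s <*> eval t1)
  where
    update' :: Env -> String -> Int -> Env
    update' e s' v = update s' v e

main :: IO ()
main = print (eval (LetT "b" (Add (Number 8) (Var "a"))
                             (Add (Number 1) (Var "b")))
                   [("a", 7)])
```

Running it reproduces the `16` from Ivan's GHCi session.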
> As Tom pointed out, not 100%, not generally, I think. This seems to be specific to the reader monad. All the best, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From roehst at gmail.com Wed Sep 12 15:12:42 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Wed, 12 Sep 2018 12:12:42 -0300 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? In-Reply-To: References: Message-ID: Yes, the example with Let Name Term Term is what I was experimenting with. About "eval t2 . (update' <*> pure s <*> eval t1)": Well I was following applicative style as "Applicative Programming with Effects" by Conor McBride I did not consider this line applicative because of the (.) operator; I am trying to get away with just `pure` and `<*>` -- to be more precise, the K and S combinators. So the question becomes: can we implement the environment modification operation without resorting to function composition? Em qua, 12 de set de 2018 às 05:10, Ivan Perez escreveu: > > On 11 September 2018 at 21:50, Rodrigo Stevaux wrote: >> >> It is easy to read an environment in applicative style, ie: >> >> type Env = [(String, Int)] >> data Term = Add Term Term | Number Int | Var String deriving Show >> eval :: Term -> Env -> Int >> eval (Add a b) = (+) <$> eval a <*> eval b >> eval (Var name) = fetch name >> eval (Number i) = pure i >> >> fetch :: String -> Env -> Int >> fetch name = fromJust . lookup name >> >> But can the eval function change the Env being passed, as to implement >> a "let" operation, without using monads? I tried I lot but ultimately >> I resorted to (>>=) in the function monad: >> >> bind f k = \r -> k (f r) r > > > I think what you mean is something like: can we extend Term with a let binding expression and implement eval using applicative interface without (>>=)? > > I think we can, and it's a bit awkward, but possible, because of the Reader monad. 
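Whether `(.)` is really needed here comes down to an identity of the function Applicative: in `((->) r)`, composition is definable from `pure` and `(<*>)` alone. A minimal, runnable check (the name `compose'` is illustrative):

```haskell
-- In the ((->) r) Applicative, pure f <*> g behaves like f . g,
-- so composition itself needs only the K and S combinators:
--   pure f      = const f      (K)
--   (f <*> g) x = f x (g x)    (S)
compose' :: (b -> c) -> (a -> b) -> (a -> c)
compose' f g = pure f <*> g   -- no (.) anywhere

main :: IO ()
main = do
  print (compose' (+ 1) (* 2) 5)   -- 11
  print (((+ 1) . (* 2)) 5)        -- 11, the same
```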
> > A trivial way of introducing let that does not manifest the issues you point out is > > data Term = Add Term Term | Number Int | Var String | Let String Int Term > > You can then implement the case for eval with > > eval (Let s v t) = eval t . update s v > > where the function update simply updates a value in the associative list. A simple implementation is: > > update :: Ord a => a -> b -> [(a, b)] -> [(a, b)] > update s v = nubBy eqFst . insertBy cmpFst (s, v) > where > eqFst x y = (==) (fst x) (fst y) > cmpFst x y = compare (fst x) (fst y) > > Of course, this does not need the monad interface, but it does not really need the applicative interface to evaluate the term either (except indirectly in eval t). > > Perhaps a more interesting alternative is: > > data Term = ... | LetT String Term Term > > where the other cases in Term remain the same. Now you need to eval the first term to change the environment, which is, I guess, what you wanted? > > You can do this combining composition with applicative: > > eval (LetT s t1 t2) = eval t2 . (update' <*> pure s <*> eval t1) > where > update' :: Env -> String -> Int -> Env > update' e s v = update s v e > > And a test (which is equivalent to let b = a + 8 in b + 1): > > *Main> eval (LetT "b" (Add (Number 8) (Var "a")) (Add (Number 1) (Var "b"))) [("a", 7)] > 16 > > >> >> >> I do not think so, because in applicative style each operand can have >> an effect (reading the environment) but can not affect other operands >> (including the next ones), i.e., there is no notion of sequencing in >> applicatives >> Is this reasoning right? > > > As Tom pointed out, not 100%, not generally, I think. This seems to be specific to the reader monad. > > All the best, > > Ivan > From monkleyon at gmail.com Wed Sep 12 16:57:16 2018 From: monkleyon at gmail.com (MarLinn) Date: Wed, 12 Sep 2018 18:57:16 +0200 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? 
In-Reply-To: References: Message-ID: <09069bbb-4207-da49-3eb3-c0401fd6422e@gmail.com> Hi. > Yes, the example with Let Name Term Term is what I was experimenting with. > > About "eval t2 . (update' <*> pure s <*> eval t1)": > > Well I was following applicative style as "Applicative Programming > with Effects" by Conor McBride > > I did not consider this line applicative because of the (.) operator; > > I am trying to get away with just `pure` and `<*>` -- to be more > precise, the K and S combinators. > > So the question becomes: can we implement the environment modification > operation without resorting to function composition? Note that for (->), (<$>) = (.). Thus eval t2 . bracket ≡ eval t2 <$> bracket Note also that by definition (<$>) = (<*>) . pure and therefore eval t2 <$> bracket ≡ pure (eval t2) <*> bracket So more precisely eval t2 . (update' <*> pure s <*> eval t1) ≡ pure (eval t2) <*> (update' <*> pure s <*> eval t1) which, as per your requirements, uses only pure and (<*>) (plus function application and brackets). Is this what you were going for? If not I think we would need more precisely defined requirements to help further. Cheers. -------------- next part -------------- An HTML attachment was scrubbed... URL: From olf at aatal-apotheke.de Wed Sep 12 19:28:05 2018 From: olf at aatal-apotheke.de (Olaf Klinke) Date: Wed, 12 Sep 2018 21:28:05 +0200 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? Message-ID: Neither >>= nor 'return' nor 'join' for the reader monad change the environment, hence the name "reader". Thus I'd chime in and claim it is not possible using the Applicative or Monad combinators alone. After all, the reader monad is ignorant of what type it is reading. In other words, if the type class combinators alone could do this, then you'd have a function that worked for _every_ type Env. But updating a list of (variable,value) pairs is a rather specific task.
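Olaf's conclusion — that changing the environment needs something beyond the type-class combinators, e.g. plain composition of readers — can be made concrete. In this sketch `Env`, the cons-based `let'`, and `fetch` are illustrative choices, not code from the thread:

```haskell
type Env = [(String, Int)]

-- A "let" that extends the environment; composition threads it through.
let' :: (String, Int) -> Env -> Env
let' binding env = binding : env   -- shadow by consing in front

fetch :: String -> Env -> Int
fetch name env = maybe (error ("unbound: " ++ name)) id (lookup name env)

-- (.) specialized to readers: (Env -> a) -> (Env -> Env) -> (Env -> a)
example :: Env -> Int
example = fetch "x" . let' ("x", 42)

main :: IO ()
main = print (example [])   -- 42
```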
Fixing the syntax for this post: type Reader a = Env -> a Since your 'let' function modifies the environment, it yields a function of type Env -> Env = Reader Env Say that your 'let' has the type let :: Def -> Reader Env for a suitable type Def, e.g. Def = (String,Int). Suppose you have implemented 'let'. We know the Applicative and Monad type class functions won't help here. But observe that (.) can be specialized to (.) :: Reader a -> Reader Env -> Reader a whence you can write eval term . let newBinding where term :: Term eval :: Term -> Reader Int let :: Def -> Reader Env newBinding :: Def Cheers, Olaf From capn.freako at gmail.com Wed Sep 12 20:01:59 2018 From: capn.freako at gmail.com (David Banas) Date: Wed, 12 Sep 2018 13:01:59 -0700 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. Message-ID: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> Hi all, Can anyone provide a known working example (GHC 8.4.3) of embedding LaTeX in Haskell source code comments intended for Haddock processing? Both in-line and block, if you’ve got them, would be great. Thanks! -db From rudy at matela.com.br Wed Sep 12 23:50:15 2018 From: rudy at matela.com.br (Rudy Matela) Date: Wed, 12 Sep 2018 20:50:15 -0300 Subject: [Haskell-cafe] ANN: leancheck-v0.7.4 with providers for Tasty, Hspec and test-framework In-Reply-To: <43d538ac-e8db-039c-58de-37af4281822a@htwk-leipzig.de> Message-ID: <20180912235015.GA15713@zero.localdomain> Hi, I'm glad you liked LeanCheck. :-) On Mon, 10 Sep 2018 15:09:43 +0200, Johannes Waldmann wrote: > Among the enumerative testing libraries, > the "competition" is between leancheck and smallcheck. > For me, the important differences are > > Enumeration of algebraic data types (ADTs), via built-in > serial combinators, e.g., cons0 Leaf \/ cons2 Branch > > * leancheck enumerates by size (number of nodes in tree) > * smallcheck enumerates by depth (longest path in tree) > > I (much!)
prefer "by size" > because this delays combinatorial explosion somewhat. Yes, size bounded enumeration is more tractable. The combinatorial explosion is still there, but a lot of the times much later in the enumeration. LeanCheck also avoids repeating tests. SmallCheck may/will repeat tests when testing successive "depths". > Automated Listable/Serial instances for ADTs > > * leancheck uses template-haskell > * smallcheck uses DeriveGeneric > > Here I would actually prefer DeriveGeneric. Might I ask why the preference? > But this certainly could be added to leancheck? Yes it could. I guess LeanCheck could provide a genericTiers / defaultTiers function (of type something like `Generic a => [[a]]`) on an optional module so that one would write: import Test.LeanCheck.Generic instance Listable MyType where tiers = genericTiers Patches or GitHub pull requests are always welcome. :-) -- I may add this myself if I ever get the time. Another addition could be defining a "default" inside the Listable typeclass itself so that one could write `data MyType = ... deriving Listable`. But I'm not very keen on this second option, as it would require an extension on LeanCheck's core which is compliant with Haskell98+Namespaces. It would also hinder the beginner-friendliness of reading that module. (Or maybe there is a way to define "orphan" default declarations for typeclasses in separate modules?) > Oh, and leancheck does not use fancy typeclassery > (smallcheck needs Multiparamtypeclasses and Monads > just to declare a Serial instance) > and it is also easier to get at the list of counterexamples. This was intentional, I'm a fan of simple types and interfaces. I'm glad you liked this. - Rudy From frederic.cogny at gmail.com Thu Sep 13 14:23:40 2018 From: frederic.cogny at gmail.com (Frederic Cogny) Date: Thu, 13 Sep 2018 16:23:40 +0200 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. 
In-Reply-To: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> References: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> Message-ID: example with both below [and its rendering further down] syntax: - \( \) for inline and - \[ \] for block [image: image.png] [image: image.png] On Wed, Sep 12, 2018 at 10:02 PM David Banas wrote: > Hi all, > > Can anyone provide a know working example (GHC 8.4.3) of embedding LaTeX > in Haskell source code comments intended for Haddock processing? > Both in-line and block, if you’ve got them, would be great. > > Thanks! > -db > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- Frederic Cogny +33 7 83 12 61 69 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 318986 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 205937 bytes Desc: not available URL: From david.feuer at gmail.com Thu Sep 13 14:29:47 2018 From: david.feuer at gmail.com (David Feuer) Date: Thu, 13 Sep 2018 10:29:47 -0400 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. In-Reply-To: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> References: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> Message-ID: Check out Data.Sequence, which uses it throughout. On Wed, Sep 12, 2018, 4:02 PM David Banas wrote: > Hi all, > > Can anyone provide a know working example (GHC 8.4.3) of embedding LaTeX > in Haskell source code comments intended for Haddock processing? > Both in-line and block, if you’ve got them, would be great. > > Thanks! 
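Frederic's markup in a compilable form — \( … \) inline and \[ … \] as a display block inside an ordinary Haddock comment. The function itself is just an illustrative placeholder:

```haskell
-- | Euclidean norm, \( \sqrt{x^2 + y^2} \), of a 2-D vector.
--
-- The same definition as a display block:
--
-- \[
-- \lVert (x, y) \rVert = \sqrt{x^2 + y^2}
-- \]
norm :: Double -> Double -> Double
norm x y = sqrt (x * x + y * y)

main :: IO ()
main = print (norm 3 4)   -- 5.0
```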
> -db > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From capn.freako at gmail.com Thu Sep 13 16:04:54 2018 From: capn.freako at gmail.com (David Banas) Date: Thu, 13 Sep 2018 09:04:54 -0700 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. In-Reply-To: References: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> Message-ID: <882AD45E-7604-4029-ACE4-75B85B9A949F@gmail.com> Thanks! I’m using the same syntax you are, but not seeing rendered equations in the resultant HTML, when viewed in Safari. (I’m about to try some other browsers.) I don’t need to give haddock any special option, do I? Is this a known issue w/ Safari? Is there a plug-in I need to install? Thanks, -db > On Sep 13, 2018, at 7:23 AM, Frederic Cogny wrote: > > example with both below [and its rendering further down] > > syntax: > - \( \) for inline and > - \[ \] for block > > > > > > > > On Wed, Sep 12, 2018 at 10:02 PM David Banas > wrote: > Hi all, > > Can anyone provide a know working example (GHC 8.4.3) of embedding LaTeX in Haskell source code comments intended for Haddock processing? > Both in-line and block, if you’ve got them, would be great. > > Thanks! > -db > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > -- > Frederic Cogny > +33 7 83 12 61 69 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Andrew.Butterfield at scss.tcd.ie Thu Sep 13 17:12:19 2018 From: Andrew.Butterfield at scss.tcd.ie (Andrew Butterfield) Date: Thu, 13 Sep 2018 18:12:19 +0100 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. In-Reply-To: <882AD45E-7604-4029-ACE4-75B85B9A949F@gmail.com> References: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> <882AD45E-7604-4029-ACE4-75B85B9A949F@gmail.com> Message-ID: <2C5EA379-4961-4720-A8AC-2E47D466372C@scss.tcd.ie> Hi David, I can see the ones in Data.Sequence (e.g. http://hackage.haskell.org/package/containers-0.6.0.1/docs/Data-Sequence.html#g:1 ) using Safari 11.1.2 with macOS Sierra 10.12.6. The only extension installed is a divx html 5 video player. Regards, Andrew > On 13 Sep 2018, at 17:04, David Banas wrote: > > Thanks! > > I’m using the same syntax you are, but not seeing rendered equations in the resultant HTML, when viewed in Safari. > (I’m about to try some other browsers.) > I don’t need to give haddock any special option, do I? > Is this a known issue w/ Safari? > Is there a plug-in I need to install? > > Thanks, > -db > > >> On Sep 13, 2018, at 7:23 AM, Frederic Cogny > wrote: >> >> example with both below [and its rendering further down] >> >> syntax: >> - \( \) for inline and >> - \[ \] for block >> >> >> >> >> >> >> >> On Wed, Sep 12, 2018 at 10:02 PM David Banas > wrote: >> Hi all, >> >> Can anyone provide a know working example (GHC 8.4.3) of embedding LaTeX in Haskell source code comments intended for Haddock processing? >> Both in-line and block, if you’ve got them, would be great. >> >> Thanks! >> -db >> >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. 
>> -- >> Frederic Cogny >> +33 7 83 12 61 69 > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------------------------------------------------------------- Andrew Butterfield Tel: +353-1-896-2517 Fax: +353-1-677-2204 Lero at TCD, Head of Foundations & Methods Research Group School of Computer Science and Statistics, Room G.39, O'Reilly Institute, Trinity College, University of Dublin http://www.scss.tcd.ie/Andrew.Butterfield/ -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screen Shot 2018-09-13 at 18.11.48.png Type: image/png Size: 11552 bytes Desc: not available URL: From neil_mayhew at users.sourceforge.net Thu Sep 13 17:33:50 2018 From: neil_mayhew at users.sourceforge.net (Neil Mayhew) Date: Thu, 13 Sep 2018 11:33:50 -0600 Subject: [Haskell-cafe] Request for example of putting LaTeX in Haskell code comments for Haddock processing. In-Reply-To: <2C5EA379-4961-4720-A8AC-2E47D466372C@scss.tcd.ie> References: <199B70CC-4128-493E-9658-C0F28E00C058@gmail.com> <882AD45E-7604-4029-ACE4-75B85B9A949F@gmail.com> <2C5EA379-4961-4720-A8AC-2E47D466372C@scss.tcd.ie> Message-ID: <409f0b58-2226-9c68-a36e-9b8e03b9865e@users.sourceforge.net> You do need to have JavaScript enabled, because the math rendering is being done with MathJax. What do you see at https://www.mathjax.org/#samples ? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From alexander.vershilov at gmail.com Fri Sep 14 08:27:16 2018 From: alexander.vershilov at gmail.com (Alexander V Vershilov) Date: Fri, 14 Sep 2018 11:27:16 +0300 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: References: Message-ID: Hi Marc, The best way of answering such questions is to check the source code. Hackage provides a nice way of doing that - click on 'Source' near the instance that you are interested in: https://hackage.haskell.org/package/time-1.6.0.1/docs/Data-Time-Format.html#t:ParseTime And you'll see the implementation ``` instance ParseTime LocalTime where buildTime l xs = LocalTime <$> (buildTime l xs) <*> (buildTime l xs) ``` That builds time from `Day` and `TimeOfDay` passing your parse string to each of those. Then you can check ParseTime instance of Day: https://hackage.haskell.org/package/time-1.6.0.1/docs/src/Data.Time.Format.Parse.html#line-331 I'm not providing it here, as it's quite big, but the main point is that `s` is ignored so in that case Day appear to be: ``` rest (YearMonth m:_) = let d = safeLast 1 [x | MonthDay x <- cs] in fromGregorianValid y m d ``` with y=m=d=1 if you continue the process for TimeOfDay you'll find that `s` is ignored there as well, and `midnight = TimeOfDay 0 0 0` is returned in that case. So it appeared that LocalTime consists of the components that ignore your parse string and return default value instead. I don't know if that is intended behaviour or not, but for me it makes more sense to parse to UTCTime/POSIXTime and then convert into LocalTime, in case if you get seconds as input. Hope that helps. On Thu, 6 Sep 2018 at 13:42, Marc Busqué wrote: > > In GHCi > > ``` > :m +Data.Time > parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe UTCTime > -- => Just 2018-08-31 03:00:06 UTC > parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe LocalTime > -- => Just 1970-01-01 00:00:00 > ``` > > Why? 
¯\(°_o)/¯ > > Marc Busqué > http://waiting-for-dev.github.io/about/_______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- Alexander From marc at lamarciana.com Fri Sep 14 10:15:33 2018 From: marc at lamarciana.com (=?ISO-8859-15?Q?Marc_Busqu=E9?=) Date: Fri, 14 Sep 2018 12:15:33 +0200 (CEST) Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: References: Message-ID: Thanks Alexander for your answer. I opened an issue in `time` repository: https://github.com/haskell/time/issues/104 It seems it is intended behaviour, but I think it is inconsistent and it makes difficult to parse from a string when you don't know beforehand the format it has. In the issue link there is the developed version of my answer... :) Marc Busqué http://waiting-for-dev.github.io/about/ On Fri, 14 Sep 2018, Alexander V Vershilov wrote: > Hi Marc, > > The best way of answering such questions is to check the source code. > Hackage provides > a nice way of doing that - click on 'Source' near the instance that > you are interested in: > > https://hackage.haskell.org/package/time-1.6.0.1/docs/Data-Time-Format.html#t:ParseTime > > And you'll see the implementation > > ``` > > instance ParseTime LocalTime where > buildTime l xs = LocalTime <$> (buildTime l xs) <*> (buildTime l xs) > ``` > > That builds time from `Day` and `TimeOfDay` passing your parse string > to each of those. 
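Alexander's recommendation — build a UTCTime (or POSIXTime) first and only then convert — in a runnable sketch, using the timestamp from Marc's original message. The choice of `utc` here stands in for whichever zone is actually wanted:

```haskell
import Data.Time (utc, utcToLocalTime)
import Data.Time.Clock.POSIX (posixSecondsToUTCTime)

main :: IO ()
main = do
  -- Seconds since the epoch -> absolute time; no format string needed.
  let t = posixSecondsToUTCTime 1535684406
  print t                        -- 2018-08-31 03:00:06 UTC
  -- Attach a zone explicitly when a LocalTime is required.
  print (utcToLocalTime utc t)   -- 2018-08-31 03:00:06
```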
> Then you can check ParseTime instance of Day: > > https://hackage.haskell.org/package/time-1.6.0.1/docs/src/Data.Time.Format.Parse.html#line-331 > > I'm not providing it here, as it's quite big, but the main point is > that `s` is ignored so in that case > Day appear to be: > > ``` > rest (YearMonth m:_) = let > d = safeLast 1 [x | MonthDay x <- cs] > in fromGregorianValid y m d > ``` > with y=m=d=1 > > if you continue the process for TimeOfDay you'll find that `s` is > ignored there as well, and > `midnight = TimeOfDay 0 0 0` is returned in that case. > > So it appeared that LocalTime consists of the components that ignore > your parse string and return > default value instead. > > I don't know if that is intended behaviour or not, but for me it makes > more sense to parse to UTCTime/POSIXTime > and then convert into LocalTime, in case if you get seconds as input. > > Hope that helps. > > > On Thu, 6 Sep 2018 at 13:42, Marc Busqué wrote: >> >> In GHCi >> >> ``` >> :m +Data.Time >> parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe UTCTime >> -- => Just 2018-08-31 03:00:06 UTC >> parseTimeM True defaultTimeLocale "%s" "1535684406" :: Maybe LocalTime >> -- => Just 1970-01-01 00:00:00 >> ``` >> >> Why? ¯\(°_o)/¯ >> >> Marc Busqué >> http://waiting-for-dev.github.io/about/_______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > > > -- > Alexander > From olf at aatal-apotheke.de Fri Sep 14 20:11:46 2018 From: olf at aatal-apotheke.de (Olaf Klinke) Date: Fri, 14 Sep 2018 22:11:46 +0200 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds Message-ID: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> Is the result of parsing Unix seconds to LocalTime even well-defined? 
A LocalTime could be in any time zone, while Unix epoch is relative to a certain time-point in UTC. Hence the parse result must depend on what time zone the LocalTime is referring to. Olaf From mail at joachim-breitner.de Sun Sep 16 08:14:18 2018 From: mail at joachim-breitner.de (Joachim Breitner) Date: Sun, 16 Sep 2018 10:14:18 +0200 Subject: [Haskell-cafe] Call for lightning talks and participation -- Haskell Implementors' Workshop Message-ID: Call for Contributions ACM SIGPLAN Haskell Implementors’ Workshop Sunday, 23 September, 2018 https://icfp18.sigplan.org/track/hiw-2018-papers Co-located with ICFP 2018 St. Louis, Missouri, US https://conf.researchr.org/home/icfp-2018 The Haskell Implementors Workshop is only one week away! Time to look at our great program at https://icfp18.sigplan.org/track/hiw-2018-papers#program and plan your day! Lightning Talks --------------- Like in the previous years, we will have slots for lightning talks. And because they were so successful last year, we will have more! *Topics* Anything related to Haskell implementations, fun uses of Haskell etc. goes. Feel free to tell us about ongoing work, to entertain, to rant, to stir a debate! (If you ever have been to a security or crypto conference, you might have attended their “rump session”. While there will not be alcohol involved at HIW, I hope that we can still match their creativity and insightful fun.) *Rules* * There are 3 sets of 3 lightning talks. * Sign-up is on day of the event, in person, on paper. No prior registration possible. * Lightning talks are 8 mins or less. If you know that your lightning talk takes less time, please say so, and maybe we can put four lightning talks into the slot. * Lightning talks do not count as peer-reviewed publications and are not published in the conference proceedings. Program Committee ----------------- * Edwin Brady (University of St. 
Andrews, UK) * Joachim Breitner – chair (DFINITY / University of Pennsylvania) * Ben Gamari (Well-Typed LLP) * Michael Hanus (Kiel University) * Roman Leshchinsky (Facebook) * Niki Vazou (University of Maryland) Contact ------- * Joachim Breitner -- Joachim Breitner former post-doctoral researcher http://cis.upenn.edu/~joachim -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: This is a digitally signed message part URL: From tdammers at gmail.com Mon Sep 17 06:01:30 2018 From: tdammers at gmail.com (Tobias Dammers) Date: Mon, 17 Sep 2018 08:01:30 +0200 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> Message-ID: <20180917060129.jbanodarfpurpdce@nibbler> On Fri, Sep 14, 2018 at 10:11:46PM +0200, Olaf Klinke wrote: > > Is the result of parsing Unix seconds to LocalTime even well-defined? > A LocalTime could be in any time zone, while Unix epoch is relative to > a certain time-point in UTC. Hence the parse result must depend on > what time zone the LocalTime is referring to. It's even worse - unix time may also refer to the host's local timezone, although these days almost everything sets the RTC to UTC, and corrects for local timezone on the fly (simply because this allows for somewhat sane handling of DST and such). Also, unix time may represent either actual seconds elapsed since epoch, or "logical" seconds since epoch (ignoring leap seconds, such that midnight is always a multiple of 86400 seconds). However, once you pick a convention for the second part, you can convert unix timestamps to some "local time without timezone info" data structure (e.g. 
LocalTime); whether you assume UTC or any timezone becomes relevant when you want to convert that data structure into a data structure that does contain timezone info explicitly (like ZonedTime) or implicitly (like UTCTime). https://hackage.haskell.org/package/time-1.9.2/docs/Data-Time.html provides a nice overview of the various date/time types defined in the `time` package, and what they mean. The relevant ones here are: - UTCTime: idealized (ignoring leap seconds) absolute moment in time - LocalTime: moment in time, in some unspecified timezone - ZonedTime: moment in time, in a specific timezone And the "morally correct" way of parsing unix time into anything would be to go through LocalTime first, then either convert to UTCTime if the unix timestamp may be interpreted as being in UTC, or attaching the correct timezone, yielding a ZonedTime. All this assuming that you can afford to ignore leap seconds one way or another. From mailinglists at robbertkrebbers.nl Mon Sep 17 07:06:17 2018 From: mailinglists at robbertkrebbers.nl (Robbert Krebbers) Date: Mon, 17 Sep 2018 09:06:17 +0200 Subject: [Haskell-cafe] CoqPL 2019: Call for Presentations Message-ID: <7b0dd7d5-19c2-f6b7-cf50-74e0b9c41ced@robbertkrebbers.nl> =================================================================== CoqPL 2019 5th International Workshop on Coq for Programming Languages -- January 19, 2019, co-located with POPL Cascais/Lisbon, Portugal CALL FOR PRESENTATIONS https://popl19.sigplan.org/track/CoqPL-2019 =================================================================== Workshop Overview ----------------- The series of CoqPL workshops provide an opportunity for programming languages researchers to meet and interact with one another and members from the core Coq development team. 
At the meeting, we will discuss upcoming new features, see talks and demonstrations of exciting current projects, solicit feedback for potential future changes, and generally work to strengthen the vibrant community around our favorite proof assistant. Topics in scope include: - General purpose libraries and tactic language extensions - Domain-specific libraries for programming language formalization and verification - IDEs, profilers, tracers, debuggers, and testing tools - Reports on ongoing proof efforts conducted via (or in the context of) the Coq proof assistant - Experience reports from Coq usage in educational or industrial contexts Workshop Format --------------- The workshop format will be driven by you, members of the community. We will solicit abstracts for talks and proposals for demonstrations and flesh out format details based on responses. We expect the final program to include experiment reports, panel discussions, and invited talks (details TBA). Talks will be selected according to relevance to the workshop, based on the submission of an extended abstract. To foster open discussion of cutting edge research which can later be published in full conference proceedings, we will not publish papers from the workshop. However, presentations will be recorded and the videos made publicly available. Submission details ------------------ Submission page: https://coqpl19.hotcrp.com/ Submission: Monday, October 15, 2018. Notification: Thursday, November 8, 2018. Workshop: Saturday, January 19, 2019 (tentative, date to be confirmed soon). Submissions for talks and demonstrations should be described in an extended abstract, between 1 and 2 pages in length (excluding the bibliography). We suggest formatting the text using the two-column ACM SIGPLAN latex style (9pt font). Templates are available from the ACM SIGPLAN page: http://www.sigplan.org/Resources/Author. 
Program Committee ----------------- Chairs: - Robbert Krebbers Delft University of Technology, the Netherlands - Ilya Sergey University College London, UK Program Committee: - Olivier Danvy Yale-NUS College, Singapore - Ronghui Gu Columbia University, USA - William Mansky Princeton University, USA - Talia Ringer University of Washington, USA - Gordon Stewart Ohio University, USA - Enrico Tassi Inria, France - Anton Trunov IMDEA Software Institute, Spain - Edwin Westbrook Galois, Inc., USA - Steve Zdancewic University of Pennsylvania, USA From ietf-dane at dukhovni.org Mon Sep 17 13:12:38 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Mon, 17 Sep 2018 09:12:38 -0400 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: <20180917060129.jbanodarfpurpdce@nibbler> References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> <20180917060129.jbanodarfpurpdce@nibbler> Message-ID: <65AB0B5C-2ACD-4DE4-A5A6-4491DBB4B8DC@dukhovni.org> > On Sep 17, 2018, at 2:01 AM, Tobias Dammers wrote: > > Also, unix time may represent either actual seconds elapsed since epoch, or > "logical" seconds since epoch (ignoring leap seconds, such that midnight > is always a multiple of 86400 seconds). That would be a violation of the specification: http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap04.html#tag_04_16 On all extant systems Unix Time is based on an 86400-second day, regardless of any leap seconds. The RTC clock has nothing to do with this, the epoch time is defined as an interval. It only becomes fuzzy when leap seconds are being handled, as different systems may handle the leap second in somewhat different ways. 
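For concreteness, the UTCTime/LocalTime/ZonedTime distinctions discussed upthread can be sketched with the `time` package (a GHC boot library). The +02:00 offset below is an arbitrary illustration — nothing in the timestamp itself tells you which zone to attach:

```haskell
import Data.Time.Clock (UTCTime)
import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
import Data.Time.LocalTime
  (hoursToTimeZone, utc, utcToLocalTime, utcToZonedTime)

main :: IO ()
main = do
  -- 17791 * 86400 seconds after the epoch, i.e. 2018-09-17 00:00:00 UTC
  -- under the usual 86400-second-day convention
  let t :: UTCTime
      t = posixSecondsToUTCTime 1537142400
  print t
  -- drop the zone: a LocalTime says nothing about where on Earth it is
  print (utcToLocalTime utc t)
  -- attach an explicit (here: arbitrary) offset instead, giving a ZonedTime
  print (utcToZonedTime (hoursToTimeZone 2) t)
```

The round trip back from LocalTime is only well-defined once you supply a TimeZone again, which is exactly the ambiguity this thread is about.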
Since the epoch time is quantized anyhow to a 1s granularity, you can expect the epoch time reported by different systems to differ by +/-1s normally, even with clocks reasonably well synchronized, and +/-2s when leap seconds are being added if they're not both using the same adjustment algorithm (say NTP leap second smearing). -- Viktor. From neil_mayhew at users.sourceforge.net Mon Sep 17 19:29:21 2018 From: neil_mayhew at users.sourceforge.net (Neil Mayhew) Date: Mon, 17 Sep 2018 13:29:21 -0600 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: <20180917060129.jbanodarfpurpdce@nibbler> References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> <20180917060129.jbanodarfpurpdce@nibbler> Message-ID: On 2018-09-17 12:01 AM, Tobias Dammers wrote: > … you can convert unix timestamps to some "local time without timezone > info" data structure (e.g. LocalTime); whether you assume UTC or any > timezone becomes relevant when you want to convert that data structure > into a data structure that does contain timezone info explicitly (like > ZonedTime) or implicitly (like UTCTime). I don't think this is possible. If you want to measure seconds since 1970-01-01 00:00:00 local time (which may actually be 1969 in UTC) you have to know the timezone at the start (ie in 1970) and at the end (ie after the seconds have elapsed). That requires knowing whether DST applies at either end, and whether the location was rezoned during the intervening time. This requires IO and the conversion from seconds is no longer a pure operation. IMHO, the only sane way to interpret seconds-since-the-epoch is using UTC. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jo at durchholz.org Mon Sep 17 21:02:52 2018 From: jo at durchholz.org (Joachim Durchholz) Date: Mon, 17 Sep 2018 23:02:52 +0200 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> <20180917060129.jbanodarfpurpdce@nibbler> Message-ID: <34ff5517-3714-27bc-4ece-d6bf8bfc438b@durchholz.org> On 17.09.2018 at 21:29, Neil Mayhew wrote: > On 2018-09-17 12:01 AM, Tobias Dammers wrote: >> … you can convert unix timestamps to some "local time without timezone >> info" data structure (e.g. LocalTime); whether you assume UTC or any >> timezone becomes relevant when you want to convert that data structure >> into a data structure that does contain timezone info explicitly (like >> ZonedTime) or implicitly (like UTCTime). > > I don't think this is possible. If you want to measure seconds since > 1970-01-01 00:00:00 local time (which may actually be 1969 in UTC) you > have to know the timezone at the start (ie in 1970) and at the end (ie > after the seconds have elapsed). That requires knowing whether DST > applies at either end, and whether the location was rezoned during the > intervening time. Actually you need only know the time offset at start and end of interval. Any intervening changes cancel out. You do need to know whether the location had its time zone changed (which happened multiple times in some areas I believe). > This requires IO and the conversion from seconds is no > longer a pure operation. Eek. IO just to handle timezone is a really bad idea. Just use a timestamp-with-timezone data type for the parameters and you're back with pure functions. > IMHO, the only sane way to interpret seconds-since-the-epoch is using UTC. Now that's true. From the point of view in the paragraph above, simply because it's lacking the data needed for any other interpretation. Note that there are excellent designs for time representation around.
A good one would have to cover instants (multiple representations), intervals (with and without leap seconds), time zones, and calendars. From my daily Java work, the one that works best is the Jodatime library, documented at http://www.joda.org/joda-time/. If anybody wishes to check whether it has good ideas worth stealing, take a look at the Javadoc at http://www.joda.org/joda-time/apidocs/index.html. If you want to be really thorough, you can also study JSR-310, which was supposed to be a slightly improved version of Jodatime, AND part of the Java standard library. That work stopped when it was "good enough", which isn't good enough. Still, the differences between Jodatime and JSR-310 may be interesting in themselves (they may show areas where the Jodatime author believed his work could be improved, as he was deeply involved in the JSR as well). Anyway: the JSR-310 classes are documented at https://docs.oracle.com/javase/8/docs/api/index.html, in the java.time package and subpackages. Regards, Jo From monnier at iro.umontreal.ca Mon Sep 17 22:00:32 2018 From: monnier at iro.umontreal.ca (Stefan Monnier) Date: Mon, 17 Sep 2018 18:00:32 -0400 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> <20180917060129.jbanodarfpurpdce@nibbler> Message-ID: > It's even worse - unix time may also refer to the host's local timezone, I don't know any unix-style system which does that. Any detail on where/when this could happen? Stefan From ietf-dane at dukhovni.org Tue Sep 18 01:15:44 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Mon, 17 Sep 2018 21:15:44 -0400 Subject: [Haskell-cafe] Algebraic Effects? Message-ID: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> I picked up Haskell fairly recently, as a "better imperative programming language" to implement highly concurrent code to survey DNSSEC and DANE adoption on the Internet.
The results are great, I got a DNS library, network and TLS stack that provide effortless concurrency, and a decent interface to Postgres in the form of the Hasql package, and performance is excellent. But I'm still a novice in functional programming, with much to learn. So it is only this week that I've started to read about Algebraic effects, and I am curious how the Haskell community views these nowadays. If this is a toxic topic raised by newbies who should just Google past discussions instead, feel free to say so... Does the below thread still sum up the situation: https://www.reddit.com/r/haskell/comments/3nkv2a/why_dont_we_use_effect_handlers_as_opposed_to/ I see Haskell now also has an Eff monad. Is it widely used? Efficient? Are there other Haskell libraries that build on it as a foundation? One potential advantage that comes to mind with Effects is that the exceptions raised by a computation can enter its signature and it becomes less likely that a library will leak unexpected exception types from its dependencies to its callers if the expected exceptions are explicit in the signatures and checked by the type system. For example, a while back the Haskell Network.DNS library leaked exceptions from a parser library that was an internal implementation detail, and my code had rare crashes on malformed DNS packets, since I did not expect or handle that exception. -- Viktor. From cdsmith at gmail.com Tue Sep 18 02:01:02 2018 From: cdsmith at gmail.com (Chris Smith) Date: Mon, 17 Sep 2018 22:01:02 -0400 Subject: [Haskell-cafe] Get-together at ICFP about Haskell in K-12 education In-Reply-To: References: Message-ID: I can now provide more details about this "Teaching Haskell in K-12" dinner I'm organizing in St. Louis during ICFP. We'll be having dinner at 7:30 pm on Monday, Sep 24, at Lombardo's Trattoria in St. Louis. This is directly across the road from the conference hotel.
Please consider attending if you are interested in making connections, learning, or sharing your experience with using Haskell or similar languages in education prior to the university level. This could be anything from teaching your own kids to running an organized program at scale. Our goal is to connect people into a critical mass of community members who know each other, stay in contact, ask and answer questions, and share ideas and experiences - possibly leading to collaborations in research, volunteerism, sharing of resources and tools, etc. I'm expecting a fun group including a mix of backgrounds from academics to hobbyists to professional developers, and from Haskell newcomers to long-standing pillars of the Haskell community. If you would like to attend, *please RSVP with this interest form* . Thanks, Chris Smith On Sat, Sep 8, 2018 at 11:53 PM Chris Smith wrote: > Hello Haskell community! > > I've been floating an idea around about hosting a get-together in St. > Louis for anyone interested in Haskell at the K-12 (pre-university) level. > Some of you know that this has been a big passion of mine for the last 8 > years or so. I'd love to discuss with others while a lot of us are in the > same place -- whether you're interested in organized teaching, or just > teaching your own kids, or just curious about the idea. > > I've put together this form to collect RSVPs and availability. > > Interest Form > > > > Thanks, > Chris Smith > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vanessa.mchale at iohk.io Tue Sep 18 02:57:58 2018 From: vanessa.mchale at iohk.io (Vanessa McHale) Date: Mon, 17 Sep 2018 21:57:58 -0500 Subject: [Haskell-cafe] Algebraic Effects? 
In-Reply-To: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> Message-ID: You can certainly create a new type signature for things that can fail with error or undefined, but keep in mind that the *real* logical bottom, viz. infinite recursion, is still there. I know that Idris and ATS both have some mechanism for checking for non-termination (and in the case of ATS, it is dealt with as an algebraic effect I believe), but GHC would not truly be able to eliminate bottoms without writing an extension yourself. In the case of the bug you mentioned I'd guess it's just API stability/the Haskell ecosystem. I believe error and undefined are in the Haskell2010 report so I doubt they're going to stop causing pain anytime soon :) On 09/17/2018 08:15 PM, Viktor Dukhovni wrote: > I picked up Haskell fairly recently, as a "better imperative programming > language" to implement highly concurrent code to survey DNSSEC and DANE > adoption on the Internet. The results are great, I got a DNS library, > network and TLS stack that provide effortless concurrency, and a decent > interface to Postgres in the form of the Hasql package and performance > is excellent. > > But I'm still a novice in functional programming, with much to learn. > So it is only this week that I've started to read about Algebraic effects, > and I curious how the Haskell community views these nowadays. > > If this is a toxic topic raised by newbies who should just Google > past discussions instead, feel free to say so... > > Does the below thread still sum up the situation: > > https://www.reddit.com/r/haskell/comments/3nkv2a/why_dont_we_use_effect_handlers_as_opposed_to/ > > I see Haskell now also has an Eff monad. Is it widely used? Efficient? > Are there other Haskell libraries that build on it as a foundation? 
> > One potential advantage that comes to mind with Effects is that the > exceptions raised by a computation can enter its signature and it > becomes less likely that a library will leak unexpected exception > types from its dependencies to its callers if the expected exceptions > are explicit in the signatures and checked by the type system. > > For example, a while back the Haskell Network.DNS library leaked exceptions > from a parser library that was an internal implementation detail, and my code > had rare crashes on malformed DNS packets, since I did not expect or handle > that exception. > -- *Vanessa McHale* Functional Compiler Engineer | Chicago, IL Website: www.iohk.io Twitter: @vamchale PGP Key ID: 4209B7B5 Input Output Twitter Github LinkedIn This e-mail and any file transmitted with it are confidential and intended solely for the use of the recipient(s) to whom it is addressed. Dissemination, distribution, and/or copying of the transmission by anyone other than the intended recipient(s) is prohibited. If you have received this transmission in error please notify IOHK immediately and delete it from your system. E-mail transmissions cannot be guaranteed to be secure or error free. We do not accept liability for any loss, damage, or error arising from this transmission -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 488 bytes Desc: OpenPGP digital signature URL: From ietf-dane at dukhovni.org Tue Sep 18 03:51:11 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Mon, 17 Sep 2018 23:51:11 -0400 Subject: [Haskell-cafe] Algebraic Effects? 
In-Reply-To: References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> Message-ID: <206FD1B1-7D15-4BE4-A436-1363A4FABBFF@dukhovni.org> > On Sep 17, 2018, at 10:57 PM, Vanessa McHale wrote: > > You can certainly create a new type signature for things that can fail with error or undefined, but keep in mind that the *real* logical bottom, viz. infinite recursion, is still there. I know that Idris and ATS both have some mechanism for checking for non-termination (and in the case of ATS, it is dealt with as an algebraic effect I believe), but GHC would not truly be able to eliminate bottoms without writing an extension yourself. Given the novelty (to me) of Algebraic Effects, my question was intended to be broader than just whether they could help expose exception signatures. Are they likely to play a larger role in Haskell? Are they sufficiently simpler to reason about or use than monads to warrant thinking in terms of Effects instead in some/many cases? > In the case of the bug you mentioned I'd guess it's just API stability/the > Haskell ecosystem. I believe error and undefined are in the Haskell2010 report > so I doubt they're going to stop causing pain anytime soon :) Yes, of course, but I would still like to see libraries convert exceptions in underlying dependencies to something that might make sense to the caller of the library. Thus, (with no prejudice against the Network.DNS library, it just happens to be a core library in my project) I'd have expected the DNS library to return some DNS-specific exception (malformed packet, ...) rather than an error from Attoparsec. And indeed this has been addressed. So that was just one possible advantage, but it seems the real win is supposed to be the ability to construct and compose lots of seemingly different primitives out of Effects (generators, concurrency, exceptions, state, ...).
And so I am curious whether Haskell is likely some day to adopt and use Effects in some essential way, or whether they will remain a feature of peripheral libraries. Effects appear to be marketed as simpler to learn/use and to offer greater modularity than monads and monad transformers. Do they deliver on these promises, especially in larger projects? -- Viktor. From lexi.lambda at gmail.com Tue Sep 18 06:06:42 2018 From: lexi.lambda at gmail.com (Alexis King) Date: Tue, 18 Sep 2018 01:06:42 -0500 Subject: [Haskell-cafe] Algebraic Effects? In-Reply-To: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> Message-ID: I think this is a good question. It is one that I investigated in detail about a year ago. Here is a brief summary of my findings: - Haskell programmers want to compose effects, but they usually express effects with monads (e.g. Reader, State, Except), and monads don’t, in general, compose. Therefore, monad transformers were born. However, monad transformers have a new problem, which is an inability to parameterize a function over the exact set of effects in an overall computation. Therefore, mtl style was born. - In recent years, extensible effects libraries have proposed a compelling, less ad-hoc approach to effect composition than mtl style, but mtl style remains by far the most dominant approach to effect composition in Haskell libraries. - In Haskell, extensible effects libraries are historically based on either free monads[1] or “freer” monads[2]. The latter approach is newer, provides a nicer API (though that is admittedly subjective), and is faster due to some clever implementation tricks. However, even freer-based EE libraries are significantly slower than mtl style because the way effect handlers are implemented as ordinary functions defeats the inliner in ways mtl style does not. That said, this cost is in (>>=), which I find is often (usually?) 
insignificant compared to other costs, so while mtl spanks EE in microbenchmarks, I did not find a meaningful performance difference between mtl style and freer-based EE in real-world applications. - In my personal experience (with an admittedly very small sample size), novice Haskellers find defining new effects with the freer-simple EE library monumentally easier than with mtl style, the latter of which requires a deep understanding of monad transformers, mtl “lifting instances”, and things like newtype deriving or default signatures. (More on freer-simple later.) - The ecosystem of EE libraries is a mess. There are extensible-effects, freer, freer-effects, freer-simple, and others. As far as I can tell, extensible-effects is based on free monads, and freer and freer-effects are both unmaintained. My recommendation: if the performance of using EE is acceptable in your application AND you are willing to pay the cost of less ecosystem support (which in practice means needing to write adapters to mtl style libraries and having access to less documentation), I would strongly recommend the freer-simple extensible effect library. MASSIVE DISCLAIMER: I am the author and maintainer of freer-simple! However, I have a few reasons to believe I am not wholly biased: 1. I developed freer-simple only after using mtl style in production applications for nearly two years and thoroughly investigating the EE landscape. 2. I actually compared and contrasted, in practice, the difference in understanding between teaching mtl style, other EE libraries, and freer-simple to Haskell novices. 3. I have a number of satisfied customers.[3][4] The distinguishing features of freer-simple are better documentation and a dramatically different (and hopefully easier to understand) API for defining new effects compared to other extensible effects libraries. 
For details, see the freer-simple module documentation on Hackage here: https://hackage.haskell.org/package/freer-simple/docs/Control-Monad-Freer.html If you have any further questions, I’m happy to answer them, but this email is long enough already! Hopefully it isn’t too overwhelming. Alexis [1]: http://okmij.org/ftp/Haskell/extensible/exteff.pdf [2]: http://okmij.org/ftp/Haskell/extensible/more.pdf [3]: https://twitter.com/rob_rix/status/1034860773808459777 [4]: https://twitter.com/importantshock/status/1035989288708657153 > On Sep 17, 2018, at 20:15, Viktor Dukhovni wrote: > > > I picked up Haskell fairly recently, as a "better imperative programming > language" to implement highly concurrent code to survey DNSSEC and DANE > adoption on the Internet. The results are great, I got a DNS library, > network and TLS stack that provide effortless concurrency, and a decent > interface to Postgres in the form of the Hasql package and performance > is excellent. > > But I'm still a novice in functional programming, with much to learn. > So it is only this week that I've started to read about Algebraic effects, > and I curious how the Haskell community views these nowadays. > > If this is a toxic topic raised by newbies who should just Google > past discussions instead, feel free to say so... > > Does the below thread still sum up the situation: > > https://www.reddit.com/r/haskell/comments/3nkv2a/why_dont_we_use_effect_handlers_as_opposed_to/ > > I see Haskell now also has an Eff monad. Is it widely used? Efficient? > Are there other Haskell libraries that build on it as a foundation? > > One potential advantage that comes to mind with Effects is that the > exceptions raised by a computation can enter its signature and it > becomes less likely that a library will leak unexpected exception > types from its dependencies to its callers if the expected exceptions > are explicit in the signatures and checked by the type system. 
> > For example, a while back the Haskell Network.DNS library leaked exceptions > from a parser library that was an internal implementation detail, and my code > had rare crashes on malformed DNS packets, since I did not expect or handle > that exception. > > -- > Viktor. From c.sternagel at gmail.com Tue Sep 18 08:21:14 2018 From: c.sternagel at gmail.com (Christian Sternagel) Date: Tue, 18 Sep 2018 10:21:14 +0200 Subject: [Haskell-cafe] natural mergesort in Data.List Message-ID: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> Dear Cafe, some years ago I formalized the mergesort implementation [1] from http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort (as far as I can tell it didn't change since 2012) in the proof assistant Isabelle/HOL [2]. More specifically, I proved its functional correctness (the result is sorted and contains all elements of the input with exactly the same multiplicities) and that it is a stable sorting algorithm. Very recently I also formalized a complexity result in Isabelle/HOL, namely that the number of comparisons is bounded from above by n + n * ⌈log 2 n⌉ for lists of length n. For this proof I had to change the definition of "sequences", "ascending", and "descending" slightly. Now here is my question: Does anyone know of reasons why the current implementation of "sequences" is allowed to produce spurious empty lists in its result?
The version I used in my formalization only differs in the following three spots: sequences [x] = [[x]] -- this is the only important change, since sequences [] = [] -- then the result does not contain empty lists instead of sequences xs = [xs] and ascending a as [] = let !x = as [a] in [x] instead of ascending a as bs = let !x = as [a] in x : sequences bs and descending a as [] = [a:as] instead of descending a as bs = (a:as) : sequences bs [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html [2] http://isabelle.in.tum.de/ From mail at joachim-breitner.de Tue Sep 18 09:45:40 2018 From: mail at joachim-breitner.de (Joachim Breitner) Date: Tue, 18 Sep 2018 11:45:40 +0200 Subject: [Haskell-cafe] natural mergesort in Data.List In-Reply-To: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> Message-ID: <277e6540082c1b35533cd323b94c19ab137cccac.camel@joachim-breitner.de> Hi, Am Dienstag, den 18.09.2018, 10:21 +0200 schrieb Christian Sternagel: > some years ago I formalized the mergesort implementation [1] from > > > http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > (as far as I can tell it didn't change since 2012) in the proof > assistant Isabelle/HOL [2]. > > More specifically, I proved its functional correctness (the result is > sorted and contains all elements of the input with exactly the same > multiplicities) and that it is a stable sorting algorithm. just a shameless plug: As part of our hs-to-coq project, we mechanically translated Data.List.sort to Coq, proved its termination (for finite inputs, of course) and functional correctness; https://github.com/antalsz/hs-to-coq/blob/master/examples/containers/theories/SortSorted.v > Very recently I also formalized a complexity result in Isabelle/HOL, > namely that the number of comparisons is bounded from above by > > n + n * ⌈log 2 n⌉ > > for lists of length n. Cool! 
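For anyone who wants to experiment, Christian's variant can be inlined into a self-contained copy of the sortBy-style code from Data.OldList (simplified here: specialised to Ord rather than taking a comparison function). The extra `mergeAll [] = []` equation is my addition, needed once `sequences [] = []` can reach `mergeAll` with an empty list:

```haskell
{-# LANGUAGE BangPatterns #-}
import Data.List (sort)

-- Natural mergesort in the style of Data.OldList.sortBy, using the
-- `sequences` variant from the thread that never emits empty runs.
sort' :: Ord a => [a] -> [a]
sort' = mergeAll . sequences
  where
    sequences (a:b:xs)
      | a > b     = descending b [a] xs
      | otherwise = ascending b (a:) xs
    sequences [x] = [[x]]   -- the important change:
    sequences []  = []      -- no empty runs in the output

    descending a as (b:bs)
      | a > b = descending b (a:as) bs
    descending a as bs = (a:as) : sequences bs

    ascending a as (b:bs)
      | a <= b = ascending b (\ys -> as (a:ys)) bs
    ascending a as bs = let !x = as [a] in x : sequences bs

    mergeAll []  = []       -- my addition: now reachable for empty input
    mergeAll [x] = x
    mergeAll xs  = mergeAll (mergePairs xs)

    mergePairs (a:b:xs) = let !x = merge a b in x : mergePairs xs
    mergePairs xs       = xs

    merge as@(a:as') bs@(b:bs')
      | a > b     = b : merge as bs'
      | otherwise = a : merge as' bs
    merge [] bs = bs
    merge as [] = as

main :: IO ()
main = do
  print (sort' [3,1,4,1,5,9,2,6])  -- [1,1,2,3,4,5,6,9]
  print (sort' ([] :: [Int]))      -- []
  print (sort' "mergesort" == sort "mergesort")
```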
That’s of course not easily possible with our shallow embedding into Coq. > For this proof I had to change the definition of "sequences", > "ascending", and "descending" slightly. > > Now here is my question: Does anyone know of reasons why the current > implementation of "sequences" is allowed to produce spurious empty lists > in its result? The version I used in my formalization only differs in > the following three spots: > … It _could_ have been benchmarked and measured and determined that it is actually faster to do less case analysis here. But I find that unlikely (both the investigation and this output), and it is probably simply an oversight, and I would expect that a patch to that effect would be appreciated! Cheers, Joachim -- Joachim Breitner mail at joachim-breitner.de http://www.joachim-breitner.de/ -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: This is a digitally signed message part URL: From david.feuer at gmail.com Tue Sep 18 09:55:30 2018 From: david.feuer at gmail.com (David Feuer) Date: Tue, 18 Sep 2018 05:55:30 -0400 Subject: [Haskell-cafe] natural mergesort in Data.List In-Reply-To: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> Message-ID: If you're guaranteeing that the result won't contain empty lists, it would be worth benchmarking the effect of using NonEmpty x instead of [x] in that spot. On Tue, Sep 18, 2018, 4:21 AM Christian Sternagel wrote: > Dear Cafe, > > some years ago I formalized the mergesort implementation [1] from > > > > http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > (as far as I can tell it didn't change since 2012) in the proof > assistant Isabelle/HOL [2].
> > More specifically, I proved its functional correctness (the result is > sorted and contains all elements of the input with exactly the same > multiplicities) and that it is a stable sorting algorithm. > > Very recently I also formalized a complexity result in Isabelle/HOL, > namely that the number of comparisons is bounded from above by > > n + n * ⌈log 2 n⌉ > > for lists of length n. > > For this proof I had to change the definition of "sequences", > "ascending", and "descending" slightly. > > Now here is my question: Does anyone now of reasons why the current > implementation of "sequences" is allowed to produce spurious empty lists > in its result? The version I used in my formalization only differs in > the following three spots: > > sequences [x] = [[x]] -- this is the only important change, since > sequences [] = [] -- then the result does not contain empty lists > > instead of > > sequences xs = [xs] > > and > > ascending a as [] = let !x = as [a] in [x] > > instead of > > ascending a as bs = let !x = as [a] in x : sequences bs > > and > > descending a as [] = [a:as] > > instead of > > descending a as bs = (a:as) : sequences bs > > [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html > [2] http://isabelle.in.tum.de/ > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From till at iks.cs.ovgu.de Tue Sep 18 10:06:07 2018 From: till at iks.cs.ovgu.de (Till Mossakowski) Date: Tue, 18 Sep 2018 12:06:07 +0200 Subject: [Haskell-cafe] Algebraic Effects? 
In-Reply-To: 
References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org>
Message-ID: <18914563-19c0-1a73-d7eb-49aa76f0bbb6@iks.cs.ovgu.de>

Have distributive laws [1] ever been used for monad composition in
Haskell? After all, two monads with a distributive law compose.

Till

[1] https://ncatlab.org/nlab/show/distributive+law

On 18.09.2018 at 08:06, Alexis King wrote:
> I think this is a good question. It is one that I investigated in detail
> about a year ago. Here is a brief summary of my findings:
>
> - Haskell programmers want to compose effects, but they usually
> express effects with monads (e.g. Reader, State, Except), and monads
> don’t, in general, compose. Therefore, monad transformers were born.
> However, monad transformers have a new problem, which is an
> inability to parameterize a function over the exact set of effects
> in an overall computation. Therefore, mtl style was born.
>
> - In recent years, extensible effects libraries have proposed a
> compelling, less ad-hoc approach to effect composition than mtl
> style, but mtl style remains by far the most dominant approach to
> effect composition in Haskell libraries.
>
> - In Haskell, extensible effects libraries are historically based
> on either free monads[1] or “freer” monads[2]. The latter approach
> is newer, provides a nicer API (though that is admittedly
> subjective), and is faster due to some clever implementation tricks.
> However, even freer-based EE libraries are significantly slower than
> mtl style because the way effect handlers are implemented as
> ordinary functions defeats the inliner in ways mtl style does not.
>
> That said, this cost is in (>>=), which I find is often (usually?)
> insignificant compared to other costs, so while mtl spanks EE in
> microbenchmarks, I did not find a meaningful performance difference
> between mtl style and freer-based EE in real-world applications.
>
> - In my personal experience (with an admittedly very small sample
> size), novice Haskellers find defining new effects with the
> freer-simple EE library monumentally easier than with mtl style, the
> latter of which requires a deep understanding of monad transformers,
> mtl “lifting instances”, and things like newtype deriving or default
> signatures. (More on freer-simple later.)
>
> - The ecosystem of EE libraries is a mess. There are
> extensible-effects, freer, freer-effects, freer-simple, and others.
> As far as I can tell, extensible-effects is based on free monads,
> and freer and freer-effects are both unmaintained.
>
> My recommendation: if the performance of using EE is acceptable in your
> application AND you are willing to pay the cost of less ecosystem
> support (which in practice means needing to write adapters to mtl style
> libraries and having access to less documentation), I would strongly
> recommend the freer-simple extensible effect library. MASSIVE
> DISCLAIMER: I am the author and maintainer of freer-simple! However, I
> have a few reasons to believe I am not wholly biased:
>
> 1. I developed freer-simple only after using mtl style in production
> applications for nearly two years and thoroughly investigating the
> EE landscape.
>
> 2. I actually compared and contrasted, in practice, the difference
> in understanding between teaching mtl style, other EE libraries,
> and freer-simple to Haskell novices.
>
> 3. I have a number of satisfied customers.[3][4]
>
> The distinguishing features of freer-simple are better documentation and
> a dramatically different (and hopefully easier to understand) API for
> defining new effects compared to other extensible effects libraries. For
> details, see the freer-simple module documentation on Hackage here:
>
> https://hackage.haskell.org/package/freer-simple/docs/Control-Monad-Freer.html
>
> If you have any further questions, I’m happy to answer them, but this
> email is long enough already! Hopefully it isn’t too overwhelming.
>
> Alexis
>
> [1]: http://okmij.org/ftp/Haskell/extensible/exteff.pdf
> [2]: http://okmij.org/ftp/Haskell/extensible/more.pdf
> [3]: https://twitter.com/rob_rix/status/1034860773808459777
> [4]: https://twitter.com/importantshock/status/1035989288708657153
>
>> On Sep 17, 2018, at 20:15, Viktor Dukhovni wrote:
>>
>>
>> I picked up Haskell fairly recently, as a "better imperative programming
>> language" to implement highly concurrent code to survey DNSSEC and DANE
>> adoption on the Internet. The results are great, I got a DNS library,
>> network and TLS stack that provide effortless concurrency, and a decent
>> interface to Postgres in the form of the Hasql package and performance
>> is excellent.
>>
>> But I'm still a novice in functional programming, with much to learn.
>> So it is only this week that I've started to read about Algebraic effects,
>> and I'm curious how the Haskell community views these nowadays.
>>
>> If this is a toxic topic raised by newbies who should just Google
>> past discussions instead, feel free to say so...
>>
>> Does the below thread still sum up the situation:
>>
>> https://www.reddit.com/r/haskell/comments/3nkv2a/why_dont_we_use_effect_handlers_as_opposed_to/
>>
>> I see Haskell now also has an Eff monad. Is it widely used? Efficient?
>> Are there other Haskell libraries that build on it as a foundation?
>>
>> One potential advantage that comes to mind with Effects is that the
>> exceptions raised by a computation can enter its signature and it
>> becomes less likely that a library will leak unexpected exception
>> types from its dependencies to its callers if the expected exceptions
>> are explicit in the signatures and checked by the type system.
>>
>> For example, a while back the Haskell Network.DNS library leaked exceptions
>> from a parser library that was an internal implementation detail, and my code
>> had rare crashes on malformed DNS packets, since I did not expect or handle
>> that exception.
>>
>> --
>> Viktor.
> _______________________________________________
> Haskell-Cafe mailing list
> To (un)subscribe, modify options or view archives go to:
> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe
> Only members subscribed via the mailman list are allowed to post.
>

From c.sternagel at gmail.com Tue Sep 18 13:14:31 2018
From: c.sternagel at gmail.com (Christian Sternagel)
Date: Tue, 18 Sep 2018 15:14:31 +0200
Subject: [Haskell-cafe] natural mergesort in Data.List
In-Reply-To: 
References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com>
Message-ID: <614f13bd-4275-f625-639e-dff097813e86@gmail.com>

Dear David,

I am guaranteeing (since I proved it in Isabelle/HOL) that the following
version of "sequences" does not contain empty lists in its result (I am
copying from my Isabelle formalization, in order to avoid typos) ;)

fun sequences :: "'a list ⇒ 'a list list"
  and asc :: "'a ⇒ ('a list ⇒ 'a list) ⇒ 'a list ⇒ 'a list list"
  and desc :: "'a ⇒ 'a list ⇒ 'a list ⇒ 'a list list"
  where
    "sequences (a # b # xs) =
      (if key a > key b then desc b [a] xs else asc b ((#) a) xs)"
  | "sequences [x] = [[x]]"
  | "sequences [] = []"
  | "asc a as (b # bs) =
      (if ¬ key a > key b then asc b (λys. as (a # ys)) bs
      else as [a] # sequences (b # bs))"
  | "asc a as [] = [as [a]]"
  | "desc a as (b # bs) =
      (if key a > key b then desc b (a # as) bs
      else (a # as) # sequences (b # bs))"
  | "desc a as [] = [a # as]"

The "key" function is an implicit first parameter for each of
"sequences", "asc", and "desc" above. The fact that I am using a "key"
function instead of a comparator is due to Isabelle/HOL's standard
library. Also, there are no pattern guards in Isabelle/HOL.
Anyway, it should be relatively straight-forward to translate these functions into Haskell. Another thing: I just realized that "merge_pairs" in my formalization also differs from "mergePairs", since with the changed "sequences" it might actually get an empty list as input, in which case the current "mergePairs" wouldn't terminate at all. So for those who are interested, the full definition of mergesort can be found here https://devel.isa-afp.org/browser_info/current/AFP/Efficient-Mergesort/Efficient_Mergesort.html where mergesort is called "msort_key". cheers chris Btw: What is "NonEmpty x"? On 09/18/2018 11:55 AM, David Feuer wrote: > If you're guaranteeing that the result won't contain empty lists, it > would be worth benchmarking the effect of using NonEmpty x instead of > [x] in that spot. > > On Tue, Sep 18, 2018, 4:21 AM Christian Sternagel > wrote: > > Dear Cafe, > > some years ago I formalized the mergesort implementation [1] from > > > http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > (as far as I can tell it didn't change since 2012) in the proof > assistant Isabelle/HOL [2]. > > More specifically, I proved its functional correctness (the result is > sorted and contains all elements of the input with exactly the same > multiplicities) and that it is a stable sorting algorithm. > > Very recently I also formalized a complexity result in Isabelle/HOL, > namely that the number of comparisons is bounded from above by > >   n + n * ⌈log 2 n⌉ > > for lists of length n. > > For this proof I had to change the definition of "sequences", > "ascending", and "descending" slightly. > > Now here is my question: Does anyone now of reasons why the current > implementation of "sequences" is allowed to produce spurious empty lists > in its result? 
The version I used in my formalization only differs in > the following three spots: > >   sequences [x] = [[x]] -- this is the only important change, since >   sequences [] = []     -- then the result does not contain empty lists > > instead of > >   sequences xs = [xs] > > and > >   ascending a as [] = let !x = as [a] in [x] > > instead of > >   ascending a as bs = let !x = as [a] in x : sequences bs > > and > >   descending a as []  = [a:as] > > instead of > >   descending a as bs = (a:as) : sequences bs > > [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html > [2] http://isabelle.in.tum.de/ > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > From david.feuer at gmail.com Tue Sep 18 13:18:28 2018 From: david.feuer at gmail.com (David Feuer) Date: Tue, 18 Sep 2018 09:18:28 -0400 Subject: [Haskell-cafe] natural mergesort in Data.List In-Reply-To: <614f13bd-4275-f625-639e-dff097813e86@gmail.com> References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> <614f13bd-4275-f625-639e-dff097813e86@gmail.com> Message-ID: data NonEmpty a = a :| [a] It's a nonempty list, defined in Data.List.NonEmpty. 
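As an illustration of what that suggestion might look like (an editor's sketch, not code from Data.OldList or from this thread), here is one hypothetical translation of the modified "sequences" in which each run is a NonEmpty, so spurious empty runs are ruled out by the type itself. A plain Ord comparison stands in for the comparator/key function that the real implementation threads through:

```haskell
{-# LANGUAGE BangPatterns #-}
module Main where

import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NE

-- Hypothetical NonEmpty-typed variant of the modified "sequences":
-- the result is a list of non-empty runs, so no [] can appear in it.
sequences :: Ord a => [a] -> [NonEmpty a]
sequences (a:b:xs)
  | a > b     = descending b (a :| []) xs
  | otherwise = ascending  b (a :|)    xs
sequences [x] = [x :| []]
sequences []  = []

-- Descending runs are accumulated in reverse, as in the original,
-- so they come out in ascending order.
descending :: Ord a => a -> NonEmpty a -> [a] -> [NonEmpty a]
descending a as (b:bs)
  | a > b = descending b (a NE.<| as) bs
descending a as bs = (a NE.<| as) : sequences bs

-- Ascending runs use a difference-list-style accumulator that finally
-- produces a NonEmpty.
ascending :: Ord a => a -> ([a] -> NonEmpty a) -> [a] -> [NonEmpty a]
ascending a as (b:bs)
  | a <= b = ascending b (\ys -> as (a:ys)) bs
ascending a as bs = let !x = as [a] in x : sequences bs

main :: IO ()
main = print (map NE.toList (sequences [3,1,2,2,5,4 :: Int]))
-- prints [[1,3],[2,2,5],[4]]
```

With this type every run is non-empty by construction; whether it is actually faster than [x] is exactly the benchmarking question raised above.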
On Tue, Sep 18, 2018, 9:14 AM Christian Sternagel wrote: > Dear David, > > I am guaranteeing (since I proved it in Isabelle/HOL) that the following > version of "sequences" does not contain empty lists in its result (I am > copying from my Isabelle formalization, in order to avoid typos) ;) > > fun sequences :: "'a list ⇒ 'a list list" > and asc :: "'a ⇒ ('a list ⇒ 'a list) ⇒ 'a list ⇒ 'a list list" > and desc :: "'a ⇒ 'a list ⇒ 'a list ⇒ 'a list list" > where > "sequences (a # b # xs) = > (if key a > key b then desc b [a] xs else asc b ((#) a) xs)" > | "sequences [x] = [[x]]" > | "sequences [] = []" > | "asc a as (b # bs) = > (if ¬ key a > key b then asc b (λys. as (a # ys)) bs > else as [a] # sequences (b # bs))" > | "asc a as [] = [as [a]]" > | "desc a as (b # bs) = > (if key a > key b then desc b (a # as) bs > else (a # as) # sequences (b # bs))" > | "desc a as [] = [a # as]" > > The "key" function is an implicit first parameter for each of > "sequences", "asc", and "desc" above. The fact that I am using a "key" > function instead of a comparator is due to Isabelle/HOL's standard > library. Also, there are no pattern guards in Isabelle/HOL. Anyway, it > should be relatively straight-forward to translate these functions into > Haskell. > > Another thing: I just realized that "merge_pairs" in my formalization > also differs from "mergePairs", since with the changed "sequences" it > might actually get an empty list as input, in which case the current > "mergePairs" wouldn't terminate at all. > > So for those who are interested, the full definition of mergesort can be > found here > > > > https://devel.isa-afp.org/browser_info/current/AFP/Efficient-Mergesort/Efficient_Mergesort.html > > where mergesort is called "msort_key". > > cheers > > chris > > Btw: What is "NonEmpty x"? 
> > On 09/18/2018 11:55 AM, David Feuer wrote: > > If you're guaranteeing that the result won't contain empty lists, it > > would be worth benchmarking the effect of using NonEmpty x instead of > > [x] in that spot. > > > > On Tue, Sep 18, 2018, 4:21 AM Christian Sternagel > > wrote: > > > > Dear Cafe, > > > > some years ago I formalized the mergesort implementation [1] from > > > > > > > http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > > > (as far as I can tell it didn't change since 2012) in the proof > > assistant Isabelle/HOL [2]. > > > > More specifically, I proved its functional correctness (the result is > > sorted and contains all elements of the input with exactly the same > > multiplicities) and that it is a stable sorting algorithm. > > > > Very recently I also formalized a complexity result in Isabelle/HOL, > > namely that the number of comparisons is bounded from above by > > > > n + n * ⌈log 2 n⌉ > > > > for lists of length n. > > > > For this proof I had to change the definition of "sequences", > > "ascending", and "descending" slightly. > > > > Now here is my question: Does anyone now of reasons why the current > > implementation of "sequences" is allowed to produce spurious empty > lists > > in its result? 
The version I used in my formalization only differs in > > the following three spots: > > > > sequences [x] = [[x]] -- this is the only important change, since > > sequences [] = [] -- then the result does not contain empty > lists > > > > instead of > > > > sequences xs = [xs] > > > > and > > > > ascending a as [] = let !x = as [a] in [x] > > > > instead of > > > > ascending a as bs = let !x = as [a] in x : sequences bs > > > > and > > > > descending a as [] = [a:as] > > > > instead of > > > > descending a as bs = (a:as) : sequences bs > > > > [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html > > [2] http://isabelle.in.tum.de/ > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From c.sternagel at gmail.com Tue Sep 18 14:38:14 2018 From: c.sternagel at gmail.com (Christian Sternagel) Date: Tue, 18 Sep 2018 16:38:14 +0200 Subject: [Haskell-cafe] natural mergesort in Data.List In-Reply-To: References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> <614f13bd-4275-f625-639e-dff097813e86@gmail.com> Message-ID: Dear David, thanks for the pointer. Btw: I was able to modify my complexity proof so that "sequences" is no longer required to only contain non-empty lists. Sorry for the noise. But maybe such a "sequences" is not entirely useless: Did I understand correctly that what you are saying is that knowing that all elements in "sequences" are non-empty lists might have some impact on performance due to being able to use "NonEmpty a"? cheers chris On 09/18/2018 03:18 PM, David Feuer wrote: > data NonEmpty a = a :| [a] > > It's a nonempty list, defined in Data.List.NonEmpty. 
> > On Tue, Sep 18, 2018, 9:14 AM Christian Sternagel > wrote: > > Dear David, > > I am guaranteeing (since I proved it in Isabelle/HOL) that the following > version of "sequences" does not contain empty lists in its result (I am > copying from my Isabelle formalization, in order to avoid typos) ;) > > fun sequences :: "'a list ⇒ 'a list list" >   and asc :: "'a ⇒ ('a list ⇒ 'a list) ⇒ 'a list ⇒ 'a list list" >   and desc :: "'a ⇒ 'a list ⇒ 'a list ⇒ 'a list list" >   where >     "sequences (a # b # xs) = >       (if key a > key b then desc b [a] xs else asc b ((#) a) xs)" >   | "sequences [x] = [[x]]" >   | "sequences [] = []" >   | "asc a as (b # bs) = >       (if ¬ key a > key b then asc b (λys. as (a # ys)) bs >       else as [a] # sequences (b # bs))" >   | "asc a as [] = [as [a]]" >   | "desc a as (b # bs) = >       (if key a > key b then desc b (a # as) bs >       else (a # as) # sequences (b # bs))" >   | "desc a as [] = [a # as]" > > The "key" function is an implicit first parameter for each of > "sequences", "asc", and "desc" above. The fact that I am using a "key" > function instead of a comparator is due to Isabelle/HOL's standard > library. Also, there are no pattern guards in Isabelle/HOL. Anyway, it > should be relatively straight-forward to translate these functions into > Haskell. > > Another thing: I just realized that "merge_pairs" in my formalization > also differs from "mergePairs", since with the changed "sequences" it > might actually get an empty list as input, in which case the current > "mergePairs" wouldn't terminate at all. > > So for those who are interested, the full definition of mergesort can be > found here > > > https://devel.isa-afp.org/browser_info/current/AFP/Efficient-Mergesort/Efficient_Mergesort.html > > where mergesort is called "msort_key". > > cheers > > chris > > Btw: What is "NonEmpty x"? 
> > On 09/18/2018 11:55 AM, David Feuer wrote: > > If you're guaranteeing that the result won't contain empty lists, it > > would be worth benchmarking the effect of using NonEmpty x instead of > > [x] in that spot. > > > > On Tue, Sep 18, 2018, 4:21 AM Christian Sternagel > > > >> wrote: > > > >     Dear Cafe, > > > >     some years ago I formalized the mergesort implementation [1] from > > > > > >    >  http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > > >     (as far as I can tell it didn't change since 2012) in the proof > >     assistant Isabelle/HOL [2]. > > > >     More specifically, I proved its functional correctness (the > result is > >     sorted and contains all elements of the input with exactly the > same > >     multiplicities) and that it is a stable sorting algorithm. > > > >     Very recently I also formalized a complexity result in > Isabelle/HOL, > >     namely that the number of comparisons is bounded from above by > > > >       n + n * ⌈log 2 n⌉ > > > >     for lists of length n. > > > >     For this proof I had to change the definition of "sequences", > >     "ascending", and "descending" slightly. > > > >     Now here is my question: Does anyone now of reasons why the > current > >     implementation of "sequences" is allowed to produce spurious > empty lists > >     in its result? 
The version I used in my formalization only > differs in > >     the following three spots: > > > >       sequences [x] = [[x]] -- this is the only important change, > since > >       sequences [] = []     -- then the result does not contain > empty lists > > > >     instead of > > > >       sequences xs = [xs] > > > >     and > > > >       ascending a as [] = let !x = as [a] in [x] > > > >     instead of > > > >       ascending a as bs = let !x = as [a] in x : sequences bs > > > >     and > > > >       descending a as []  = [a:as] > > > >     instead of > > > >       descending a as bs = (a:as) : sequences bs > > > >     [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html > >     [2] http://isabelle.in.tum.de/ > >     _______________________________________________ > >     Haskell-Cafe mailing list > >     To (un)subscribe, modify options or view archives go to: > >     http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > >     Only members subscribed via the mailman list are allowed to post. > > > From david.feuer at gmail.com Tue Sep 18 14:58:24 2018 From: david.feuer at gmail.com (David Feuer) Date: Tue, 18 Sep 2018 10:58:24 -0400 Subject: [Haskell-cafe] natural mergesort in Data.List In-Reply-To: References: <2af13628-38f8-a292-44f4-32919f565f1d@gmail.com> <614f13bd-4275-f625-639e-dff097813e86@gmail.com> Message-ID: It may. Or it may not. The performance impact of such small changes can be hard to predict. On Tue, Sep 18, 2018, 10:38 AM Christian Sternagel wrote: > Dear David, > > thanks for the pointer. > > Btw: I was able to modify my complexity proof so that "sequences" is no > longer required to only contain non-empty lists. Sorry for the noise. > > But maybe such a "sequences" is not entirely useless: Did I understand > correctly that what you are saying is that knowing that all elements in > "sequences" are non-empty lists might have some impact on performance > due to being able to use "NonEmpty a"? 
> > cheers > > chris > > On 09/18/2018 03:18 PM, David Feuer wrote: > > data NonEmpty a = a :| [a] > > > > It's a nonempty list, defined in Data.List.NonEmpty. > > > > On Tue, Sep 18, 2018, 9:14 AM Christian Sternagel > > wrote: > > > > Dear David, > > > > I am guaranteeing (since I proved it in Isabelle/HOL) that the > following > > version of "sequences" does not contain empty lists in its result (I > am > > copying from my Isabelle formalization, in order to avoid typos) ;) > > > > fun sequences :: "'a list ⇒ 'a list list" > > and asc :: "'a ⇒ ('a list ⇒ 'a list) ⇒ 'a list ⇒ 'a list list" > > and desc :: "'a ⇒ 'a list ⇒ 'a list ⇒ 'a list list" > > where > > "sequences (a # b # xs) = > > (if key a > key b then desc b [a] xs else asc b ((#) a) xs)" > > | "sequences [x] = [[x]]" > > | "sequences [] = []" > > | "asc a as (b # bs) = > > (if ¬ key a > key b then asc b (λys. as (a # ys)) bs > > else as [a] # sequences (b # bs))" > > | "asc a as [] = [as [a]]" > > | "desc a as (b # bs) = > > (if key a > key b then desc b (a # as) bs > > else (a # as) # sequences (b # bs))" > > | "desc a as [] = [a # as]" > > > > The "key" function is an implicit first parameter for each of > > "sequences", "asc", and "desc" above. The fact that I am using a > "key" > > function instead of a comparator is due to Isabelle/HOL's standard > > library. Also, there are no pattern guards in Isabelle/HOL. Anyway, > it > > should be relatively straight-forward to translate these functions > into > > Haskell. > > > > Another thing: I just realized that "merge_pairs" in my formalization > > also differs from "mergePairs", since with the changed "sequences" it > > might actually get an empty list as input, in which case the current > > "mergePairs" wouldn't terminate at all. 
> > > > So for those who are interested, the full definition of mergesort > can be > > found here > > > > > > > https://devel.isa-afp.org/browser_info/current/AFP/Efficient-Mergesort/Efficient_Mergesort.html > > > > where mergesort is called "msort_key". > > > > cheers > > > > chris > > > > Btw: What is "NonEmpty x"? > > > > On 09/18/2018 11:55 AM, David Feuer wrote: > > > If you're guaranteeing that the result won't contain empty lists, > it > > > would be worth benchmarking the effect of using NonEmpty x instead > of > > > [x] in that spot. > > > > > > On Tue, Sep 18, 2018, 4:21 AM Christian Sternagel > > > > > >> > wrote: > > > > > > Dear Cafe, > > > > > > some years ago I formalized the mergesort implementation [1] > from > > > > > > > > > > > > http://hackage.haskell.org/package/base-4.11.1.0/docs/src/Data.OldList.html#sort > > > > > > (as far as I can tell it didn't change since 2012) in the proof > > > assistant Isabelle/HOL [2]. > > > > > > More specifically, I proved its functional correctness (the > > result is > > > sorted and contains all elements of the input with exactly the > > same > > > multiplicities) and that it is a stable sorting algorithm. > > > > > > Very recently I also formalized a complexity result in > > Isabelle/HOL, > > > namely that the number of comparisons is bounded from above by > > > > > > n + n * ⌈log 2 n⌉ > > > > > > for lists of length n. > > > > > > For this proof I had to change the definition of "sequences", > > > "ascending", and "descending" slightly. > > > > > > Now here is my question: Does anyone now of reasons why the > > current > > > implementation of "sequences" is allowed to produce spurious > > empty lists > > > in its result? 
The version I used in my formalization only > > differs in > > > the following three spots: > > > > > > sequences [x] = [[x]] -- this is the only important change, > > since > > > sequences [] = [] -- then the result does not contain > > empty lists > > > > > > instead of > > > > > > sequences xs = [xs] > > > > > > and > > > > > > ascending a as [] = let !x = as [a] in [x] > > > > > > instead of > > > > > > ascending a as bs = let !x = as [a] in x : sequences bs > > > > > > and > > > > > > descending a as [] = [a:as] > > > > > > instead of > > > > > > descending a as bs = (a:as) : sequences bs > > > > > > [1] https://www.isa-afp.org/entries/Efficient-Mergesort.html > > > [2] http://isabelle.in.tum.de/ > > > _______________________________________________ > > > Haskell-Cafe mailing list > > > To (un)subscribe, modify options or view archives go to: > > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > > Only members subscribed via the mailman list are allowed to > post. > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ietf-dane at dukhovni.org Tue Sep 18 15:49:54 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Tue, 18 Sep 2018 11:49:54 -0400 Subject: [Haskell-cafe] Algebraic Effects? In-Reply-To: References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> Message-ID: <20C39866-9BBD-474E-B5F0-7F7B7861FED4@dukhovni.org> > On Sep 18, 2018, at 2:06 AM, Alexis King wrote: > > - The ecosystem of EE libraries is a mess. Yes, it is rather unclear to me whether exploring any of these is is worth the effort, if so which, or how to use them given sometimes scant documentation. > There are > extensible-effects, freer, freer-effects, freer-simple, and others. > As far as I can tell, extensible-effects is based on free monads, > and freer and freer-effects are both unmaintained. I took a quick look at: https://hackage.haskell.org/package/extensible-0.4.10. 
The author claims good performance:

https://www.schoolofhaskell.com/user/fumieval/extensible/the-world-s-fastest-extensible-effects-framework

I've not tried any benchmarks or yet any non-trivial code using this
library. The documentation is rather minimal, but I got the below to
compile and run:

{-# LANGUAGE DataKinds #-}
{-# LANGUAGE TypeOperators #-}

module Main where

import Control.Monad.Reader (ask)
import Control.Monad.State (get, put)
import Control.Monad.Writer (tell)
import Control.Monad.IO.Class (liftIO)
import Data.Monoid (Sum(..))
import Data.Extensible
import Data.Extensible.Effect.Default

type IOeff = "IO" :> IO

type KitchenSink a = Eff '[ReaderDef Int, StateDef Int, WriterDef (Sum Int), IOeff] a

type JustIO a = Eff '[IOeff] a

type Result a = ((a, Int), Sum Int)

handler :: KitchenSink a -> JustIO (Result a)
handler = runWriterDef . flip runStateDef 3 . flip runReaderDef 5

main :: IO ()
main = do
    x <- retractEff $ handler $ do
        liftIO $ putStrLn "running"
        s <- get
        r <- ask
        tell (Sum s)
        tell (Sum $ s + 1)
        put $! s + r
        return $ "magic"
    print x

it outputs:

    running
    (("magic",8),Sum {getSum = 7})

With this library, at least when building effects out of mtl
transformers, the order of the effects in the Eff type declaration has
to match in reverse order the composition of the "runFooDef" functions.
That is, the types:

    Eff '[ReaderDef r, StateDef s]

and

    Eff '[StateDef s, ReaderDef r]

are not the same. Perhaps this is a feature?

>
> My recommendation: if the performance of using EE is acceptable in your
> application AND you are willing to pay the cost of less ecosystem
> support (which in practice means needing to write adapters to mtl style
> libraries and having access to less documentation), I would strongly
> recommend the freer-simple extensible effect library. MASSIVE
> DISCLAIMER: I am the author and maintainer of freer-simple! However, I
> have a few reasons to believe I am not wholly biased:

Thanks. I'll take a look.
Any comments on similarities to or differences from the "extensible"
package above?

> The distinguishing features of freer-simple are better documentation

Barring major downsides, that's a compelling difference.

> and a dramatically different (and hopefully easier to understand) API for
> defining new effects compared to other extensible effects libraries. For
> details, see the freer-simple module documentation on Hackage here:
>
> https://hackage.haskell.org/package/freer-simple/docs/Control-Monad-Freer.html
>
> If you have any further questions, I’m happy to answer them, but this
> email is long enough already! Hopefully it isn’t too overwhelming.

Much appreciated. Still trying to figure out whether to look into this
further. My project runs a concurrent computation for many hours,
allocating and freeing terabytes of memory over its lifetime:

    5,019,533,368,664 bytes allocated in the heap
      162,945,132,824 bytes copied during GC
           73,229,680 bytes maximum residency (3421 sample(s))
            4,150,592 bytes maximum slop
                  356 MB total memory in use (83 MB lost due to fragmentation)

One concern for me is whether using Effects is likely to cause more
allocations and work for the GC, or is the memory pressure about the
same?

--
    Viktor.

From olf at aatal-apotheke.de Tue Sep 18 20:53:48 2018
From: olf at aatal-apotheke.de (Olaf Klinke)
Date: Tue, 18 Sep 2018 22:53:48 +0200
Subject: [Haskell-cafe] Algebraic Effects?
Message-ID: <85EEA099-77D4-4DC9-A78D-3353E17E71BB@aatal-apotheke.de>

> Have distributive laws [1] ever been used for monad composition in
> Haskell? After all, two monads with a distributive law compose.
>
> Till

There is the distributive package on Hackage, which at least matches
the type of the nLab definition.
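To make the composition claim concrete (an editor's sketch; the names distr, RM, and joinRM are illustrative and not from any package): the distr_Reader example below, specialised to m = Maybe, is a distributive law Maybe . Reader e -> Reader e . Maybe, and that law alone suffices to define join for the composite Reader e . Maybe, which then behaves like join in ReaderT e Maybe:

```haskell
module Main where

import Control.Monad (join)

-- distr_Reader at m = Maybe: a distributive law
-- Maybe . Reader e  ->  Reader e . Maybe.
distr :: Maybe (e -> a) -> e -> Maybe a
distr m e = fmap ($ e) m

-- The composite monad Reader e . Maybe, as a newtype (name is illustrative).
newtype RM e a = RM { runRM :: e -> Maybe a }

-- join for the composite, built only from the law and the component joins:
-- join for Reader is application at a shared environment, join for Maybe
-- is Control.Monad.join.
joinRM :: RM e (RM e a) -> RM e a
joinRM (RM f) = RM $ \e -> join (distr (fmap runRM (f e)) e)

main :: IO ()
main = do
  let m = RM $ \e -> if e > 0 then Just (RM (\e' -> Just (e + e'))) else Nothing
  print (runRM (joinRM m) 3)     -- Just 6
  print (runRM (joinRM m) (-1))  -- Nothing
```

This matches the general recipe: given a law lambda : T S -> S T for monads S and T, the composite S T gets its join as (join_S join_T) after S lambda T.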
Many common monad transformers stem from monads that are "universally
distributive" in the following sense: If t is a monad transformer, then
for all other monads m a function with one of the following types can
be defined:

t Identity (m a) -> m (t Identity a)
m (t Identity a) -> t Identity (m a)

For which transformers is this the case? For those where t m a is
isomorphic to one of the two types

t Identity (m a)
m (t Identity a)

Examples:

distr_Reader :: Functor m => m (r -> a) -> r -> m a
distr_Reader = \m -> \r -> fmap ($r) m

distr_Writer :: Functor m => (w,m a) -> m (w,a)
distr_Writer = \(w,m) -> fmap ((,) w) m

distr_Either :: Applicative m => Either e (m a) -> m (Either e a)
distr_Either (Left e) = pure (Left e)
distr_Either (Right m) = fmap Right m

distr_Maybe :: Applicative m => Maybe (m a) -> m (Maybe a)
distr_Maybe Nothing = pure Nothing
distr_Maybe (Just m) = fmap Just m

More generally, two monads distribute when one is an endofunctor on the
Kleisli category of the other. (The nLab page says Eilenberg-Moore
category, I'm not sure whether the two statements are equivalent.) A
morphism in the Kleisli category of m is a function a -> m b. An
endofunctor f on the Kleisli category must map such a function to a
function f a -> m (f b). For example,

kleisli_Maybe :: Applicative m => (a -> m b) -> Maybe a -> m (Maybe b)
kleisli_Maybe _ Nothing = pure Nothing
kleisli_Maybe k (Just a) = fmap Just (k a)

kleisli_Reader :: Functor m => (a -> r -> b) -> m a -> r -> m b
kleisli_Reader k x r = fmap (flip k r) x

Notice that in the former case we used the Kleisli category of m
whereas in the latter case we used the Kleisli category of Reader.

The more general picture involves an adjunction and the monad m. This
covers StateT and the continuation monad. It is well-known that any
adjunction F -| G between two categories gives rise to a monad, namely
GF. Suppose F:C->D and G:D->C are functors. The adjunction says that
the hom-sets D(Fa,b) and C(a,Gb) are naturally isomorphic.
For the state monad,

C = Hask (the category of Haskell types and functions)
D = Hask,
F = (,) s
G = (->) s

and the isomorphism of hom-sets is currying and uncurrying between (s,a) -> b and a -> (s -> b). In the case of the continuation monad Cont r a = (a -> r) -> r, we have

C = Hask,
D = Hask^op (Hask with direction of arrows reversed)
F a = (a -> r)
G b = (b -> r)

and the isomorphism of hom-sets is between a -> (b -> r) and b -> (a -> r), where the latter is Hask^op(Fa,b). In the case of StateT, since the intermediate category D is Hask, any functor m is an endofunctor on the intermediate category. I do not recall under which conditions the composite GMF is a monad.

Olaf

From zemyla at gmail.com  Wed Sep 19 02:56:17 2018
From: zemyla at gmail.com (Zemyla)
Date: Tue, 18 Sep 2018 21:56:17 -0500
Subject: [Haskell-cafe] Algebraic Effects?
In-Reply-To: <18914563-19c0-1a73-d7eb-49aa76f0bbb6@iks.cs.ovgu.de>
References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org>
 <18914563-19c0-1a73-d7eb-49aa76f0bbb6@iks.cs.ovgu.de>
Message-ID: 

The only monads with distributive laws are in Data.Distributive, and they're all isomorphic to (->) e for some e. So that's effectively ReaderT.

On Tue, Sep 18, 2018 at 5:06 AM, Till Mossakowski wrote:
> Have distributive laws [1] ever been used for monad composition in
> Haskell? After all, two monads with a distributive law compose.
>
> Till
>
> [1] https://ncatlab.org/nlab/show/distributive+law
>
> On 18.09.2018 at 08:06, Alexis King wrote:
>> I think this is a good question. It is one that I investigated in detail
>> about a year ago. Here is a brief summary of my findings:
>>
>> - Haskell programmers want to compose effects, but they usually
>> express effects with monads (e.g. Reader, State, Except), and monads
>> don’t, in general, compose. Therefore, monad transformers were born.
>> However, monad transformers have a new problem, which is an >> inability to parameterize a function over the exact set of effects >> in an overall computation. Therefore, mtl style was born. >> >> - In recent years, extensible effects libraries have proposed a >> compelling, less ad-hoc approach to effect composition than mtl >> style, but mtl style remains by far the most dominant approach to >> effect composition in Haskell libraries. >> >> - In Haskell, extensible effects libraries are historically based >> on either free monads[1] or “freer” monads[2]. The latter approach >> is newer, provides a nicer API (though that is admittedly >> subjective), and is faster due to some clever implementation tricks. >> However, even freer-based EE libraries are significantly slower than >> mtl style because the way effect handlers are implemented as >> ordinary functions defeats the inliner in ways mtl style does not. >> >> That said, this cost is in (>>=), which I find is often (usually?) >> insignificant compared to other costs, so while mtl spanks EE in >> microbenchmarks, I did not find a meaningful performance difference >> between mtl style and freer-based EE in real-world applications. >> >> - In my personal experience (with an admittedly very small sample >> size), novice Haskellers find defining new effects with the >> freer-simple EE library monumentally easier than with mtl style, the >> latter of which requires a deep understanding of monad transformers, >> mtl “lifting instances”, and things like newtype deriving or default >> signatures. (More on freer-simple later.) >> >> - The ecosystem of EE libraries is a mess. There are >> extensible-effects, freer, freer-effects, freer-simple, and others. >> As far as I can tell, extensible-effects is based on free monads, >> and freer and freer-effects are both unmaintained. 
>> >> My recommendation: if the performance of using EE is acceptable in your >> application AND you are willing to pay the cost of less ecosystem >> support (which in practice means needing to write adapters to mtl style >> libraries and having access to less documentation), I would strongly >> recommend the freer-simple extensible effect library. MASSIVE >> DISCLAIMER: I am the author and maintainer of freer-simple! However, I >> have a few reasons to believe I am not wholly biased: >> >> 1. I developed freer-simple only after using mtl style in production >> applications for nearly two years and thoroughly investigating the >> EE landscape. >> >> 2. I actually compared and contrasted, in practice, the difference >> in understanding between teaching mtl style, other EE libraries, >> and freer-simple to Haskell novices. >> >> 3. I have a number of satisfied customers.[3][4] >> >> The distinguishing features of freer-simple are better documentation and >> a dramatically different (and hopefully easier to understand) API for >> defining new effects compared to other extensible effects libraries. For >> details, see the freer-simple module documentation on Hackage here: >> >> https://hackage.haskell.org/package/freer-simple/docs/Control-Monad-Freer.html >> >> If you have any further questions, I’m happy to answer them, but this >> email is long enough already! Hopefully it isn’t too overwhelming. >> >> Alexis >> >> [1]: http://okmij.org/ftp/Haskell/extensible/exteff.pdf >> [2]: http://okmij.org/ftp/Haskell/extensible/more.pdf >> [3]: https://twitter.com/rob_rix/status/1034860773808459777 >> [4]: https://twitter.com/importantshock/status/1035989288708657153 >> >>> On Sep 17, 2018, at 20:15, Viktor Dukhovni wrote: >>> >>> >>> I picked up Haskell fairly recently, as a "better imperative programming >>> language" to implement highly concurrent code to survey DNSSEC and DANE >>> adoption on the Internet. 
The results are great, I got a DNS library, >>> network and TLS stack that provide effortless concurrency, and a decent >>> interface to Postgres in the form of the Hasql package and performance >>> is excellent. >>> >>> But I'm still a novice in functional programming, with much to learn. >>> So it is only this week that I've started to read about Algebraic effects, >>> and I'm curious how the Haskell community views these nowadays. >>> >>> If this is a toxic topic raised by newbies who should just Google >>> past discussions instead, feel free to say so... >>> >>> Does the below thread still sum up the situation: >>> >>> https://www.reddit.com/r/haskell/comments/3nkv2a/why_dont_we_use_effect_handlers_as_opposed_to/ >>> >>> I see Haskell now also has an Eff monad. Is it widely used? Efficient? >>> Are there other Haskell libraries that build on it as a foundation? >>> >>> One potential advantage that comes to mind with Effects is that the >>> exceptions raised by a computation can enter its signature and it >>> becomes less likely that a library will leak unexpected exception >>> types from its dependencies to its callers if the expected exceptions >>> are explicit in the signatures and checked by the type system. >>> >>> For example, a while back the Haskell Network.DNS library leaked exceptions >>> from a parser library that was an internal implementation detail, and my code >>> had rare crashes on malformed DNS packets, since I did not expect or handle >>> that exception. >>> >>> -- >>> Viktor. >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post.
>>
>
> _______________________________________________
> Haskell-Cafe mailing list
> To (un)subscribe, modify options or view archives go to:
> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe
> Only members subscribed via the mailman list are allowed to post.

From whosekiteneverfly at gmail.com  Wed Sep 19 03:12:57 2018
From: whosekiteneverfly at gmail.com (Yuji Yamamoto)
Date: Wed, 19 Sep 2018 12:12:57 +0900
Subject: [Haskell-cafe] Call for lightning talks and participation -- Haskell Implementors' Workshop
In-Reply-To: 
References: 
Message-ID: 

Hello, I'm interested in giving a lightning talk about my package named substring-parser, if I can successfully prepare slides and I don't feel bad (due to jet lag). But I'm concerned whether that topic is suitable for

> Anything related to Haskell implementations, fun uses of Haskell etc.

because substring-parser is just a tiny library, not actually related to Haskell implementations. Can you give me some advice? Thanks in advance!

On Sun, 16 Sep 2018 at 17:15, Joachim Breitner wrote:

> Call for Contributions
> ACM SIGPLAN Haskell Implementors’ Workshop
> Sunday, 23 September, 2018
>
> https://icfp18.sigplan.org/track/hiw-2018-papers
>
> Co-located with ICFP 2018
> St. Louis, Missouri, US
> https://conf.researchr.org/home/icfp-2018
>
>
> The Haskell Implementors Workshop is only one week away! Time to look
> at our great program at
>
> https://icfp18.sigplan.org/track/hiw-2018-papers#program
>
> and plan your day!
>
> Lightning Talks
> ---------------
>
> Like in the previous years, we will have slots for lightning talks. And
> because they were so successful last year, we will have more!
>
> *Topics*
>
> Anything related to Haskell implementations, fun uses of Haskell etc.
> goes. Feel free to tell us about ongoing work, to entertain, to rant,
> to stir a debate!
>
> (If you ever have been to a security or crypto conference, you might
> have attended their “rump session”.
While there will not be alcohol > involved at HIW, I hope that we can still match their creativity and > insightful fun.) > > *Rules* > > * There are 3 sets of 3 lightning talks. > * Sign-up is on day of the event, in person, on paper. No prior > registration possible. > * Lightning talks are 8 mins or less. > If you know that your lightning talk takes less time, please say so, > and maybe we can put four lightning talks into the slot. > * Lightning talks do not count as peer-reviewed publications and are > not published in the conference proceedings. > > Program Committee > ----------------- > > * Edwin Brady (University of St. Andrews, UK) > * Joachim Breitner – chair (DFINITY / University of Pennsylvania) > * Ben Gamari (Well-Typed LLP) > * Michael Hanus (Kiel University) > * Roman Leshchinsky (Facebook) > * Niki Vazou (University of Maryland) > > Contact > ------- > > * Joachim Breitner > > > -- > Joachim Breitner > former post-doctoral researcher > > http://cis.upenn.edu/~joachim > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- 山本悠滋 twitter: @igrep GitHub: https://github.com/igrep GitLab: https://gitlab.com/igrep Facebook: http://www.facebook.com/igrep Google+: https://plus.google.com/u/0/+YujiYamamoto_igrep -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From tdammers at gmail.com  Wed Sep 19 12:52:59 2018
From: tdammers at gmail.com (Tobias Dammers)
Date: Wed, 19 Sep 2018 14:52:59 +0200
Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds
In-Reply-To: <65AB0B5C-2ACD-4DE4-A5A6-4491DBB4B8DC@dukhovni.org>
References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de>
 <20180917060129.jbanodarfpurpdce@nibbler>
 <65AB0B5C-2ACD-4DE4-A5A6-4491DBB4B8DC@dukhovni.org>
Message-ID: <20180919125258.7fqogdpvulgwwaru@nibbler>

On Mon, Sep 17, 2018 at 09:12:38AM -0400, Viktor Dukhovni wrote:
>
>
> > On Sep 17, 2018, at 2:01 AM, Tobias Dammers wrote:
> >
> > Also, unix time may represent either actual seconds elapsed since epoch, or
> > "logical" seconds since epoch (ignoring leap seconds, such that midnight
> > is always a multiple of 86400 seconds).
>
> That would be a violation of the specification:
>
> http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap04.html#tag_04_16

Ah yes, you are right. Problem is, while Unix systems today use Unix time this way, not all other software systems do, or they may assume that "current unix timestamp - some unix timestamp in the past = exact number of seconds elapsed between timestamps". Which doesn't hold in the presence of leap seconds, no matter how you implement them. In other words, people use Unix timestamps (both the word and actual implementations) sloppily.

From dominik.schrempf at gmail.com  Wed Sep 19 14:28:32 2018
From: dominik.schrempf at gmail.com (Dominik Schrempf)
Date: Wed, 19 Sep 2018 16:28:32 +0200
Subject: [Haskell-cafe] Typeclassopedia for numbers
Message-ID: <8736u536jz.fsf@gmail.com>

Hello, the Typeclassopedia[1] lists standard Haskell (algebraic?) type classes and their relations. I was wondering if a similar construct also exists for numeric type classes (and probably also their instances), since I am always struggling with how, e.g., 'Integral' numbers are related to 'Fractional' and so on and so forth.
Thanks, Dominik [1] https://wiki.haskell.org/Typeclassopedia From ietf-dane at dukhovni.org Wed Sep 19 15:24:18 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Wed, 19 Sep 2018 11:24:18 -0400 Subject: [Haskell-cafe] Parsing LocalTime from Unix seconds In-Reply-To: <20180919125258.7fqogdpvulgwwaru@nibbler> References: <55AA829D-D879-4C74-AAF9-FD72EB269A37@aatal-apotheke.de> <20180917060129.jbanodarfpurpdce@nibbler> <65AB0B5C-2ACD-4DE4-A5A6-4491DBB4B8DC@dukhovni.org> <20180919125258.7fqogdpvulgwwaru@nibbler> Message-ID: > On Sep 19, 2018, at 8:52 AM, Tobias Dammers wrote: > >>> Also, unix time may represent either actual seconds elapsed since epoch, or >>> "logical" seconds since epoch (ignoring leap seconds, such that midnight >>> is always a multiple of 86400 seconds). >> >> That would be a violation of the specification: >> >> http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap04.html#tag_04_16 > > Ah yes, you are right. Problem is, while Unix systems to day use Unix > time this way, not all other software systems do, or they may assume > that "current unix timestamp - some unix timestamp in the past = > exact number of seconds elapsed between timestamps". Which doesn't hold > in the presence of leap seconds, no matter how you implement them. In > other words, people use Unix timestamps (both the word and actual > implementations) sloppily. That may be so, but Haskell's libraries should not cater to broken implementations. UNIX epoch time is the time elapsed since 19700101T00:00:00+0000 - number of leap seconds added since 19700101T00:00:00+0000. In principle this means that the epoch time repeats during the leap second. In practice smearing is the better approach taken by many systems with the clock remaining monotone, but briefly running more slowly (for added leap seconds). 
So you're not guaranteed precise synchronization with any particular clock, but there is a deterministic conversion from epoch time to local time that does not need to concern itself with leap seconds. This conversion will never produce a 60th second, but could produce a 59th second that never existed if there are ever negative leap seconds. -- Viktor. From emil.a.hammarstrom at gmail.com Wed Sep 19 19:47:36 2018 From: emil.a.hammarstrom at gmail.com (=?UTF-8?Q?Emil_Hammarstr=C3=B6m?=) Date: Wed, 19 Sep 2018 21:47:36 +0200 Subject: [Haskell-cafe] Typeclassopedia for numbers In-Reply-To: <8736u536jz.fsf@gmail.com> References: <8736u536jz.fsf@gmail.com> Message-ID: Hi Dominik, Is the hierarchy shown on page 7 on the slides linked below what you are looking for? http://fileadmin.cs.lth.se/cs/Education/EDAN40/lectures/Types.4.pdf On Wed, Sep 19, 2018 at 4:29 PM Dominik Schrempf wrote: > Hello, > > the Typeclassopedia[1] lists standard Haskell (algebraic ?) type classes > and > their relations. I was wondering if a similar construct also exists for > numeric > type classes (and probably also their instances), since I am always > struggling > with how, e.g., 'Integral' number are related to 'Fractional' and so on > and so > forth. > > Thanks, > Dominik > > [1] https://wiki.haskell.org/Typeclassopedia > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dominik.schrempf at gmail.com Thu Sep 20 08:59:13 2018 From: dominik.schrempf at gmail.com (Dominik Schrempf) Date: Thu, 20 Sep 2018 10:59:13 +0200 Subject: [Haskell-cafe] Typeclassopedia for numbers In-Reply-To: References: <8736u536jz.fsf@gmail.com> Message-ID: <87y3bwtuhq.fsf@gmail.com> Hi, yes, that is exactly what I was looking for, thank you. Can we include this figure into the Wiki? If we cannot include the figure directly, I could produce a similar plot, but I am not an expert in this issue. Dominik Emil Hammarström writes: > Hi Dominik, > > Is the hierarchy shown on page 7 on the slides linked below what you are > looking for? > > http://fileadmin.cs.lth.se/cs/Education/EDAN40/lectures/Types.4.pdf > > On Wed, Sep 19, 2018 at 4:29 PM Dominik Schrempf > wrote: > >> Hello, >> >> the Typeclassopedia[1] lists standard Haskell (algebraic ?) type classes >> and >> their relations. I was wondering if a similar construct also exists for >> numeric >> type classes (and probably also their instances), since I am always >> struggling >> with how, e.g., 'Integral' number are related to 'Fractional' and so on >> and so >> forth. >> >> Thanks, >> Dominik >> >> [1] https://wiki.haskell.org/Typeclassopedia >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. From emil.a.hammarstrom at gmail.com Thu Sep 20 09:14:45 2018 From: emil.a.hammarstrom at gmail.com (=?UTF-8?Q?Emil_Hammarstr=C3=B6m?=) Date: Thu, 20 Sep 2018 11:14:45 +0200 Subject: [Haskell-cafe] Typeclassopedia for numbers In-Reply-To: <87y3bwtuhq.fsf@gmail.com> References: <8736u536jz.fsf@gmail.com> <87y3bwtuhq.fsf@gmail.com> Message-ID: The plot can be found in both the Haskell 98 and 2010 report in section 6.3. 
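For the archives, the hierarchy in that figure can also be summarized in code comments, together with the conversions that bridge its two branches (a sketch based on the Haskell 2010 report's class declarations; the helper functions are my own):

```haskell
-- Rough map of the standard numeric classes (per the Haskell 2010 report):
--
--   Num            (+), (*), negate, fromInteger
--   |- Real        Num plus Ord; toRational
--   |   |- Integral   div, mod, toInteger     (Int, Integer)
--   |   |- RealFrac   truncate, round, floor  (also requires Fractional)
--   |- Fractional  (/), fromRational          (Float, Double, Rational)
--       |- Floating   pi, exp, sqrt, sin      (Float, Double)
--
-- Integral and Fractional sit on disjoint branches: no standard type
-- inhabits both, so crossing over is always an explicit conversion.

average :: [Int] -> Double
average xs = fromIntegral (sum xs) / fromIntegral (length xs)
-- average [1, 2, 3, 4] == 2.5

toDouble :: Rational -> Double
toDouble = fromRational
```

The key practical point is the last comment: there is no implicit numeric coercion in Haskell, so fromIntegral, toRational, fromRational and friends are how values move between the Integral and Fractional worlds.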
On Thu, Sep 20, 2018, 10:59 Dominik Schrempf wrote: > Hi, > > yes, that is exactly what I was looking for, thank you. Can we include this > figure into the Wiki? If we cannot include the figure directly, I could > produce > a similar plot, but I am not an expert in this issue. > > Dominik > > Emil Hammarström writes: > > > Hi Dominik, > > > > Is the hierarchy shown on page 7 on the slides linked below what you are > > looking for? > > > > http://fileadmin.cs.lth.se/cs/Education/EDAN40/lectures/Types.4.pdf > > > > On Wed, Sep 19, 2018 at 4:29 PM Dominik Schrempf < > dominik.schrempf at gmail.com> > > wrote: > > > >> Hello, > >> > >> the Typeclassopedia[1] lists standard Haskell (algebraic ?) type classes > >> and > >> their relations. I was wondering if a similar construct also exists for > >> numeric > >> type classes (and probably also their instances), since I am always > >> struggling > >> with how, e.g., 'Integral' number are related to 'Fractional' and so on > >> and so > >> forth. > >> > >> Thanks, > >> Dominik > >> > >> [1] https://wiki.haskell.org/Typeclassopedia > >> _______________________________________________ > >> Haskell-Cafe mailing list > >> To (un)subscribe, modify options or view archives go to: > >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > >> Only members subscribed via the mailman list are allowed to post. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lanablack at amok.cc Thu Sep 20 14:29:02 2018 From: lanablack at amok.cc (Lana Black) Date: Thu, 20 Sep 2018 14:29:02 +0000 Subject: [Haskell-cafe] Algebraic Effects? In-Reply-To: References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org> Message-ID: On 9/18/18 6:06 AM, Alexis King wrote: > - The ecosystem of EE libraries is a mess. There are > extensible-effects, freer, freer-effects, freer-simple, and others. > As far as I can tell, extensible-effects is based on free monads, > and freer and freer-effects are both unmaintained. 
extensible-effects is also based on freer monads. One advantage over freer-simple and other libraries is that it includes monad-control instances for the most commonly used effects, which makes it mostly painless to use things like forkIO. The limitations of monad-control apply, though.

Huge disclaimer: I'm one of the contributors to extensible-effects.

From olf at aatal-apotheke.de  Thu Sep 20 19:35:35 2018
From: olf at aatal-apotheke.de (Olaf Klinke)
Date: Thu, 20 Sep 2018 21:35:35 +0200
Subject: [Haskell-cafe] Origin of monad transformers (was: Algebraic Effects?)
Message-ID: <405D9D66-EEB1-4794-A7ED-380E4FDD6007@aatal-apotheke.de>

> The only monads with distributive laws are in Data.Distributive, and
> they're all isomorphic to (->) e for some e. So that's effectively
> ReaderT.

That is true for this particular signature of distributive law, and I find this a fascinating result, especially because there are Haskell terms witnessing this categorical fact. In [1, Theorem 2.4.2] it is stated (as I understand it) that if two monads H and K have a distributive law, then the first is a functor on the Kleisli category of the second, while the second is a functor on the Eilenberg-Moore category of the first. In the same paper, Section 4 describes monads on Set (including e.g. trees, bags, Maybe) which distribute over all commutative monads.

But maybe one does not need such heavy categorical machinery to describe where monad transformers in Haskell come from. Consider the following. There are at least three kinds of monad transformers t in mtl:

(1) t m a = m (t' a) for some monad t' (e.g. ExceptT)
(2) t m a = t' (m a) for some monad t' (e.g. ReaderT)
(3) t m a = g (m (f a)) for some functors g and f (e.g. StateT)

Only (2) is of the Data.Distributive kind, but all three can be unified under the following scheme:

Lemma: If F ⊣ G is an adjunction where F: C → D and G: D → C are functors, and if M: D → D is a monad on D, then G∘M∘F is a monad.
Proof: One can factor M through its Kleisli category as an adjunction L ⊣ R, where L: D → Kleisli(M) and R: Kleisli(M) → D. Since adjoints compose, we have L∘F ⊣ G∘R and G∘R∘L∘F = G∘M∘F. q.e.d.

There is another interesting paper [2] showing that for a certain class of monads, checking the conditions for a distributive law is easier than in the general case.

Olaf

[1] http://emis.ams.org/journals/TAC/volumes/18/7/18-07.pdf
[2] https://doi.org/10.1016/S0022-4049(01)00097-4

From lexi.lambda at gmail.com  Thu Sep 20 20:45:09 2018
From: lexi.lambda at gmail.com (Alexis King)
Date: Thu, 20 Sep 2018 15:45:09 -0500
Subject: [Haskell-cafe] Algebraic Effects?
In-Reply-To: <20C39866-9BBD-474E-B5F0-7F7B7861FED4@dukhovni.org>
References: <038D0B86-6697-4ECB-9F36-53D9175B4D10@dukhovni.org>
 <20C39866-9BBD-474E-B5F0-7F7B7861FED4@dukhovni.org>
Message-ID: <43AF598F-CAAF-46B9-8B18-C19A63632FF9@gmail.com>

> On Sep 18, 2018, at 10:49, Viktor Dukhovni wrote:
>
> I took a quick look at:
>
> https://hackage.haskell.org/package/extensible-0.4.10.
>
> The author claims good performance:
>
> https://www.schoolofhaskell.com/user/fumieval/extensible/the-world-s-fastest-extensible-effects-framework

That’s an interesting post. I am not totally sure why precisely it would be faster than freer/freer-effects/freer-simple, but it’s worth looking into to see if its optimizations can be incorporated into freer-simple. In any case, it seems like it’s “only” faster by a factor of ~2x, while the difference in performance between mtl and extensible is a factor of ~12x, so the difference is not massive. Still, it’s worth investigating.

> With this library, at least when building effects out of mtl transformers,
> the order of the effects in the Eff type declaration has to match in
> reverse order the composition of the "runFooDef" functions. That is,
> the types:
>
> Eff '[ReaderDef r, StateDef s]
> and
> Eff '[StateDef s, ReaderDef r]
>
> are not the same. Perhaps this is a feature?
Generally, when working with extensible effect frameworks, you do not work with concrete lists of effects, since those two types are indeed different. Instead, you write functions polymorphic over the set of effects, and you include constraints on what must be somewhere in the list. Therefore, you’d express your example like this in freer-simple:

Members '[Reader r, State s] effs => Eff effs a

This constraint expresses that `Reader r` and `State s` must both exist somewhere in the list of effects `effs`, but it doesn’t matter where they are or what order they’re in. You only pick a concrete set of effects when finally running them.

In a sense, this is similar, but not quite equivalent to, the difference between using transformers directly and using mtl typeclasses. Specifically, I mean the difference between these three types:

ReaderT r (State s) a
StateT s (Reader r) a
(MonadReader r m, MonadState s m) => m a

The first two specify a concrete transformer stack, which specifies both the order of the transformers and the whole contents of the stack. The mtl classes parameterize over the precise transformer stack, which enables easier composition. However, in a sense, a use of Eff with a concrete list is sort of in between these two types — since each effect can still be handled with an arbitrary effect handler, a concrete list enforces effect order relative to other effects, but it doesn’t enforce which handler must be used to handle each effect.

The relationship between those things is largely unimportant when actually using extensible effects, however. Just use the `Member` or `Members` constraints instead of specifying a concrete list, and you’ll be alright.

> Thanks. I'll take a look. Any comments on similarities to or
> differences from the "extensible" package above?
I am not intimately familiar with the extensible package, but it has a much broader scope than simply implementing an extensible effects library: it also implements open records/sums, and a whole bunch of other things (of, in my opinion, questionable usefulness). As for a comparison between exclusively the effects subset of extensible and freer-simple, extensible seems to have a very different API, which labels each effect in an effect stack. Personally, I am skeptical of extensible’s API. It seems to make some tradeoffs with benefits that I can’t imagine are worth the costs. Still, I haven’t used it in practice, so I can’t seriously judge it. > One concern for me is whether using Effects is likely to cause more allocations > and work for the GC, or is the memory pressure about the same? It’s hard to say. My conservative answer is “yes, extensible-effects will allocate more”, since extensible effects systems reify computations as data structures by design, and sadly, GHC isn’t clever enough to eliminate those data structures even when effect handlers are used in such a way that all effects can be statically determined. In contrast, the GHC optimizer is really good at optimizing away the dictionaries inserted by mtl-style typeclasses. That said, I don’t feel like I can really say for sure without being intimately familiar with a given program. The problem with things like extensible-effects is that microbenchmarks are always going to be misleading; the performance characteristics (relative to mtl style, anyway) can vary wildly depending on how they’re used in practice. I think this is an area where you just have to benchmark and see. > On Sep 20, 2018, at 09:29, Lana Black wrote: > > extensible-effects is also based on freer monads. One advantage over freer-simple and other libraries is that it includes monad-control instances for most used effects, allowing to mostly painlessly use things like forkIO. The limitations of monad-control apply though. 
Thanks for the correction. I peeked at the documentation, and it looks like extensible-effects switched to using freer in November of last year, which sounds about right, since I was looking into extensible effects libraries last July, prior to the switch. I guess I’m a little behind the times.

The MonadBaseControl instances are neat, though they seem to be rather limited in that they hardcode particular effect handlers. I don’t know of a way around that restriction, but it does seem potentially misleading to me. In any case, the effort of implementing MonadBaseControl instances is still quite high, and since it still needs to be done for user-defined effects, I’m not sure how useful it would be in practice (as when I use either mtl style or extensible effects, I define a lot of my own effects rather than only using the ones provided by the library).

Alexis

From aditya.siram at gmail.com  Fri Sep 21 11:13:55 2018
From: aditya.siram at gmail.com (aditya siram)
Date: Fri, 21 Sep 2018 06:13:55 -0500
Subject: [Haskell-cafe] Suppress re-exported module's docs ...
Message-ID: 

Is there any way to have Haddock ignore a re-exported module without hiding it? eg. here all of `B`'s exports/docs show up in `A`'s Haddock page and I don't want to see them there:

module A
  ( ...
    module B
  ) where

import B
...

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From hvriedel at gmail.com  Sat Sep 22 08:32:06 2018
From: hvriedel at gmail.com (Herbert Valerio Riedel)
Date: Sat, 22 Sep 2018 10:32:06 +0200
Subject: [Haskell-cafe] [ANNOUNCE] GHC 8.6.1 released
In-Reply-To: <87wore1h9i.fsf@smart-cactus.org> (Ben Gamari's message of "Fri, 21 Sep 2018 20:57:02 -0400")
References: <87wore1h9i.fsf@smart-cactus.org>
Message-ID: <875zyyhr09.fsf@gmail.com>

Hello everyone,

Here's an addendum to the announcement, as it omitted an important detail: GHC 8.6.1 is only guaranteed to work properly with tooling which uses lib:Cabal version 2.4.0.1 or later.
As such, GHC 8.6.1 works best with `cabal-install` 2.4.0.0 or later; please upgrade to `cabal-install` 2.4.0.0 if you haven't already. Note that cabal-install 2.4 supports all GHC versions back to GHC 7.0.4, and we also strongly recommend using the latest available stable release of `cabal` even with older GHC releases: bugfixes and improvements aren't always backported to older Cabal releases, and recent versions of Cabal are needed to benefit from recently added cabal file-format features[8] (and to access package releases on Hackage[9] which rely on those features).

Note that binaries aren't available on cabal's download page[1] yet. If you're on Ubuntu or Debian, you can get a compiled cabal-install 2.4 `.deb` package via Apt from

- https://launchpad.net/~hvr/+archive/ubuntu/ghc or
- http://downloads.haskell.org/debian/

respectively. Binary versions for macOS and Windows are also expected to become available via [2] and [3] soon (and also at [1]).

In the meantime, if you already have GHC 7.10 or later (together with a compatible `cabal` executable) installed, you can easily install cabal 2.4 yourself from Hackage[9] by invoking

    cabal install cabal-install-2.4.0.0

and making sure that the resulting `cabal` executable is accessible via your $PATH; you can check with `cabal --version`, which should emit something along the lines of

    $ cabal --version
    cabal-install version 2.4.0.0
    compiled using version 2.4.0.1 of the Cabal library

Finally, the Haskell Platform[4] release for GHC 8.6.1 should be available soon as well, which provides yet another recommended "standard way to get GHC and related tools"[5] in a uniform way across multiple operating systems. See [4] and [5] for more details about the standard Haskell Platform distribution.
[1]: https://www.haskell.org/cabal/download.html [2]: https://haskell.futurice.com/ [3]: https://hub.zhox.com/posts/chocolatey-introduction/ [4]: https://www.haskell.org/platform/ [5]: https://mail.haskell.org/pipermail/ghc-devs/2015-July/009379.html [6]: https://launchpad.net/~hvr/+archive/ubuntu/ghc [7]: http://downloads.haskell.org/debian/ [8]: https://cabal.readthedocs.io/en/latest/file-format-changelog.html [9]: http://hackage.haskell.org/ -- Herbert On 2018-09-21 at 20:57:02 -0400, Ben Gamari wrote: > Hello everyone, > > The GHC team is pleased to announce the availability of GHC 8.6.1, the > fourth major release in the GHC 8 series. The source distribution, binary > distributions, and documentation for this release are available at > > https://downloads.haskell.org/~ghc/8.6.1 > > The 8.6 release fixes over 400 bugs from the 8.4 series and introduces a > number of exciting features. These most notably include: > > * A new deriving mechanism, `deriving via`, providing a convenient way > for users to extend Haskell's typeclass deriving mechanism > > * Quantified constraints, allowing forall quantification in constraint contexts > > * An early version of the GHCi `:doc` command > > * The `ghc-heap-view` package, allowing introspection into the > structure of GHC's heap > > * Valid hole fit hints, helping the user to find terms to fill typed > holes in their programs > > * The BlockArguments extension, allowing the `$` operator to be omitted > in some unambiguous contexts > > * An exciting new plugin mechanism, source plugins, allowing plugins to > inspect and modify a wide variety of compiler representations. 
> > * Improved recompilation checking when plugins are used > > * Significantly better handling of macOS linker command size limits, > avoiding linker errors while linking large projects > > * The next phase of the MonadFail proposal, enabling > -XMonadFailDesugaring by default > > A full list of the changes in this release can be found in the > release notes: > > https://downloads.haskell.org/~ghc/8.6.1/docs/html/users_guide/8.6.1-notes.html > > Perhaps of equal importance, GHC 8.6 is the second major release made > under GHC's accelerated six-month release schedule and the first set of > binary distributions built primarily using our new continuous > integration scheme. While the final 8.6 release is around three weeks > later than initially scheduled due to late-breaking bug reports, we > expect that the 8.8 release schedule shouldn't be affected. > > Thanks to everyone who has contributed to developing, documenting, and > testing this release! > > As always, let us know if you encounter trouble. > > > How to get it > ~~~~~~~~~~~~~ > > The easy way is to go to the web page, which should be self-explanatory: > > https://www.haskell.org/ghc/ > > We supply binary builds in the native package format for many > platforms, and the source distribution is available from the same > place. > > Packages will appear as they are built - if the package for your > system isn't available yet, please try again later. > > > Background > ~~~~~~~~~~ > > Haskell is a standard lazy functional programming language. > > GHC is a state-of-the-art programming suite for Haskell. Included is > an optimising compiler generating efficient code for a variety of > platforms, together with an interactive system for convenient, quick > development. The distribution includes space and time profiling > facilities, a large collection of libraries, and support for various > language extensions, including concurrency, exceptions, and foreign > language interfaces. 
GHC is distributed under a BSD-style open source license. > > A wide variety of Haskell related resources (tutorials, libraries, > specifications, documentation, compilers, interpreters, references, > contact information, links to research groups) are available from the > Haskell home page (see below). > > > On-line GHC-related resources > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ > > Relevant URLs on the World-Wide Web: > > GHC home page https://www.haskell.org/ghc/ > GHC developers' home page https://ghc.haskell.org/trac/ghc/ > Haskell home page https://www.haskell.org/ > > > Supported Platforms > ~~~~~~~~~~~~~~~~~~~ > > The list of platforms we support, and the people responsible for them, > is here: > > https://ghc.haskell.org/trac/ghc/wiki/Contributors > > Ports to other platforms are possible with varying degrees of > difficulty. The Building Guide describes how to go about porting to a > new platform: > > https://ghc.haskell.org/trac/ghc/wiki/Building > > > Developers > ~~~~~~~~~~ > > We welcome new contributors. Instructions on accessing our source > code repository, and getting started with hacking on GHC, are > available from the GHC's developer's site run by Trac: > > https://ghc.haskell.org/trac/ghc/ > > > Mailing lists > ~~~~~~~~~~~~~ > > We run mailing lists for GHC users and bug reports; to subscribe, use > the web interfaces at > > https://mail.haskell.org/cgi-bin/mailman/listinfo/glasgow-haskell-users > https://mail.haskell.org/cgi-bin/mailman/listinfo/ghc-tickets > > There are several other haskell and ghc-related mailing lists on > www.haskell.org; for the full list, see > > https://mail.haskell.org/cgi-bin/mailman/listinfo > > Some GHC developers hang out on #haskell on IRC, too: > > https://www.haskell.org/haskellwiki/IRC_channel > > Please report bugs using our bug tracking system. 
Instructions on reporting bugs can be found here: > > https://www.haskell.org/ghc/reportabug

From leah at vuxu.org Sat Sep 22 12:34:38 2018 From: leah at vuxu.org (Leah Neukirchen) Date: Sat, 22 Sep 2018 14:34:38 +0200 Subject: [Haskell-cafe] Munich Haskell Meeting, 2018-09-24 @ 19:30 Message-ID: <87a7o9lnhd.fsf@vuxu.org>

Dear all, Next week, our monthly Munich Haskell Meeting will take place again on Monday, September 24 at Cafe Puck at 19h30. For details see here: http://muenchen.haskell.bayern/dates.html If you plan to join, please add yourself to this dudle so we can reserve enough seats! It is OK to add yourself to the dudle anonymously or pseudonymously. https://dudle.inf.tu-dresden.de/haskell-munich-sep-2018/ Everybody is welcome! cu, -- Leah Neukirchen http://leah.zone

From albert+haskell at zeitkraut.de Mon Sep 24 07:07:17 2018 From: albert+haskell at zeitkraut.de (Albert Krewinkel) Date: Mon, 24 Sep 2018 09:07:17 +0200 Subject: [Haskell-cafe] ANN: hslua-1.0.0 Message-ID: <87o9cn8jbu.fsf@zeitkraut.de>

HsLua provides bindings, wrappers, types, and helper functions to bridge Haskell and the embeddable scripting language Lua. The package ships with a Lua interpreter. HsLua makes Haskell programs extensible and scriptable via a widely used language. I am pleased to announce the release of hslua-1.0.0. Changes have been made to most parts of the library: error handling became simpler, as conversion between Haskell exceptions and Lua errors is now automatic; other improvements concern type safety, stability, and ease of use. Please see the changelog for the complete list of changes: New versions of the extension packages hslua-aeson (marshalling via aeson) and hslua-module-text (a UTF-8 aware string module) have been released as well.
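[Editor's sketch] As a rough illustration of the kind of bridging HsLua provides, here is a minimal sketch of running a Lua snippet from Haskell and reading a global back. The module and function names (`Foreign.Lua`, `run`, `openlibs`, `dostring`, `getglobal`, `peek`) are assumptions based on the hslua-1.0.0 interface; consult the package documentation for the exact signatures.

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Requires the hslua package (version 1.0.0).
import qualified Foreign.Lua as Lua

main :: IO ()
main = do
  answer <- Lua.run $ do
    Lua.openlibs                      -- load Lua's standard libraries
    _ <- Lua.dostring "x = 6 * 7"     -- run a Lua chunk
    Lua.getglobal "x"                 -- push the global `x` onto the stack
    Lua.peek (-1) :: Lua.Lua Int      -- marshal the stack top back to Haskell
  print answer
```

Per the announcement, Haskell exceptions and Lua errors are now converted automatically, so the `Lua` action above needs no explicit error plumbing.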
For additional information and usage notes: Best, - Albert

From harendra.kumar at gmail.com Mon Sep 24 10:42:23 2018 From: harendra.kumar at gmail.com (Harendra Kumar) Date: Mon, 24 Sep 2018 16:12:23 +0530 Subject: [Haskell-cafe] clonetype Message-ID:

Often, we need to create a newtype that is equivalent to a given type for safety reasons. Using a type synonym is useless from a type-safety perspective. With a newtype, we have to add a "deriving" clause to derive the required instances and make it practically useful. Does it make sense, and is it possible to have something like a "clonetype" that creates a new type and derives all the instances of the parent type as well? It would be quite helpful in creating equivalent newtype synonyms quickly. Almost always, I do not use a newtype where I should, just because of the inconvenience of deriving the instances. Ideally, we should just be able to say something like:

  clonetype MyString = String

and we are good to go. What is the shortest possible way to achieve this with currently available mechanisms, if any? -harendra

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From asandroq at gmail.com Mon Sep 24 11:18:10 2018 From: asandroq at gmail.com (Alex Silva) Date: Mon, 24 Sep 2018 13:18:10 +0200 Subject: [Haskell-cafe] clonetype In-Reply-To: References: Message-ID: <458d6f20-a866-d56d-4f83-2ffec3bc03c2@gmail.com>

Hi, On 24/09/18 12:42, Harendra Kumar wrote: > > and we are good to go. What is the shortest possible way to achieve > this with currently available mechanisms, if any? > What about GeneralizedNewtypeDeriving[1]?
[1] https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/glasgow_exts.html#generalised-derived-instances-for-newtypes Cheers, -- -alex https://unendli.ch/

From oleg.grenrus at iki.fi Mon Sep 24 12:47:29 2018 From: oleg.grenrus at iki.fi (Oleg Grenrus) Date: Mon, 24 Sep 2018 15:47:29 +0300 Subject: [Haskell-cafe] clonetype In-Reply-To: References: Message-ID:

The problem is that "all instances" is hard to pinpoint. We have an open-world assumption, so instances can be added later (in the dependency tree). Should they be cloned too? And even if you restrict it to "instances visible at the clonetype definition", that's IMHO not a good idea either, as it's an implicit and volatile set (editing imports may change the set). So use either the tagged trick/pattern or GND. Haskell's heavy type machinery exists so we can explicitly and exactly say what we need or want. The tagged/phantom pattern and GND are different; they have different pros & cons. Having a one-size-fits-all `clonetype` seems non-trivial to me.

Sent from my iPhone

> On 24 Sep 2018, at 15.25, Harendra Kumar wrote: > > That comes close, but Haskell having such a heavy type machinery to do all the esoteric type stuff in the world, but not allowing you to express such a simple day to day programming thing in a simpler manner is not very satisfying. > > -harendra > >> On Mon, 24 Sep 2018 at 16:16, Oleg Grenrus wrote: >> >> data MyStringTag >> type MyString = Tagged MyStringTag String >> >> Sent from my iPhone >> >>> On 24 Sep 2018, at 13.42, Harendra Kumar wrote: >>> >>> Often, we need to create a newtype that is equivalent to a given type for safety reasons. Using type synonym is useless from type safety perspective. With newtype, we have to add a "deriving" clause to it for deriving the required instances, to make it practically useful. >>> >>> Does it make sense, and is it possible to have something like a "clonetype" that creates a new type and derives all the instances of the parent type as well?
It will be quite helpful in creating equivalent newtype synonyms quickly. Almost always, I do not use a newtype where I should just because of the inconvenience of deriving the instances. Ideally, we should just be able to say something like: >>> >>> clonetype MyString = String >>> >>> and we are good to go. What is the shortest possible way to achieve this with currently available mechanisms, if any? >>> >>> -harendra >>> _______________________________________________ >>> Haskell-Cafe mailing list >>> To (un)subscribe, modify options or view archives go to: >>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >>> Only members subscribed via the mailman list are allowed to post.

From lysxia at gmail.com Mon Sep 24 13:08:35 2018 From: lysxia at gmail.com (Li-yao Xia) Date: Mon, 24 Sep 2018 09:08:35 -0400 Subject: [Haskell-cafe] clonetype In-Reply-To: References: Message-ID: <8db5b23c-e0a5-46d5-c7d2-7739fd6f9835@gmail.com>

Another issue with saying "derive all instances for this newtype" is that not all instances are coercible.

  type family F a where
    F Char = Int

  class C a where
    f :: a -> F a

  instance C Char where
    f _ = 0

  newtype D = D Char

There is no way to derive an instance C D. A more explicit and flexible solution would be "deriving synonyms", discussed recently here: https://www.reddit.com/r/haskell/comments/9dx6s9/proposal_data_deriving_synonyms/ Li-yao

On 9/24/18 8:47 AM, Oleg Grenrus wrote: > The problem is that "All instances" is hard to pin point. We have open > world assumption, so instances can be added later (in the dependency > tree). Should they be cloned too? And even of you restrict to "instances > visible at clonetype definition", that's IMHO not a good idea either, as > it's implicit and volatile set (editing imports changes may change the set). > > So use either tagged-trick/pattern or GND.
> > Haskell's heavy type machinery exists so we can explicitly and exactly > say what we need or want. Tagged/phantom-pattern and GND are different, > they have different pros & cons. Having `clonetype-fits-all` seems > non-trivial to me. > > Sent from my iPhone > > On 24 Sep 2018, at 15.25, Harendra Kumar > wrote: > >> That comes close, but Haskell having such a heavy type machinery to do >> all the esoteric type stuff in the world, but not allowing you to >> express such a simple day to day programming thing in a simpler manner >> is not very satisfying. >> >> -harendra >> >> On Mon, 24 Sep 2018 at 16:16, Oleg Grenrus > > wrote: >> >> data MyStringTag >> type MyString = Tagged MyStringTag String >> >> Sent from my iPhone >> >> On 24 Sep 2018, at 13.42, Harendra Kumar > > wrote: >> >>> Often, we need to create a newtype that is equivalent to a given >>> type for safety reasons. Using type  synonym is useless from type >>> safety perspective. With newtype, we have to add a "deriving" >>> clause to it for deriving the required instances, to make it >>> practically useful. >>> >>> Does it make sense, and is it possible to have something like a >>> "clonetype" that creates a new type and derives all the instances >>> of the parent type as well? It will be quite helpful in creating >>> equivalent newtype synonyms quickly. Almost always, I do not use >>> a newtype where I should just because of the inconvenience of >>> deriving the instances. Ideally, we should just be able to say >>> something like: >>> >>> clonetype MyString = String >>> >>> and we are good to go.  What is the shortest possible way to >>> achieve this with currently available mechanisms, if any? >>> >>> -harendra >>> _______________________________________________ >>> Haskell-Cafe mailing list >>> To (un)subscribe, modify options or view archives go to: >>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >>> Only members subscribed via the mailman list are allowed to post. 
>> > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post.

From mats.rauhala at gmail.com Mon Sep 24 13:09:13 2018 From: mats.rauhala at gmail.com (Mats Rauhala) Date: Mon, 24 Sep 2018 16:09:13 +0300 Subject: [Haskell-cafe] [JOB] Haskell job opportunity at RELEX Solutions in Helsinki, Finland Message-ID: <20180924130913.v73itjuqzhb7aay2@peitto>

Would you like to get paid for writing Haskell? And do you live in Europe, preferably in Helsinki, Finland? If yes, please read on. RELEX Solutions is Europe's fastest growing provider of integrated retail and supply chain planning solutions. We are looking for a few Haskell developers to join our small but impactful group of Haskellers to develop our internal tools. In this position you will be exposed to various technologies, such as Haskell, Servant, Elm, GHCJS, Terraform, Ansible, Nix, RHEL/CentOS, AWS, etc. Your own interests will influence what your daily work consists of. Your main responsibilities will be some mix of development, operations, and support of and around our coordination system. Interested in applying? Follow the link: https://relex.recruiterbox.com/jobs/fk01gjr/ Full remote work for this position is negotiable, but you should be able to be present at the Helsinki office for at least a couple of weeks, preferably longer. Details of your contract are largely negotiable. Please note that the position will be filled as soon as suitable candidates are found, so don't wait too long ;). I'm also available for questions: - 'masser' at functionalprogramming slack - 'MasseR' at freenode - 'mats.rauhala at gmail.com' email or on this mailing list.
-- Mats Rauhala

From manny at fpcomplete.com Mon Sep 24 13:13:51 2018 From: manny at fpcomplete.com (Emanuel Borsboom) Date: Mon, 24 Sep 2018 06:13:51 -0700 Subject: [Haskell-cafe] ANN: stack-1.9 release candidate Message-ID:

This is the first release candidate for stack-1.9. Binaries for supported platforms are available here: https://github.com/commercialhaskell/stack/releases/tag/v1.9.0.1

## Changes since v1.7.1

Release notes:

* Statically linked Linux bindists are back again, thanks to [@nh2](https://github.com/nh2). **Please try the `stack-1.9.0.1-linux-x86_64-static.tar.gz` distribution if using Linux**, since it will be the default installed version once v1.9.1 is released.
* We will be deleting the Ubuntu, Debian, CentOS, Fedora, and Arch package repos from `download.fpcomplete.com` soon. These have been deprecated for over a year and have not received new releases, but were left in place for compatibility with older scripts.

Major changes:

* `GHCJS` support is being downgraded to 'experimental'. A warning notifying the user of the experimental status of `GHCJS` will be displayed.

Behavior changes:

* `ghc-options` from `stack.yaml` are now appended to `ghc-options` from `config.yaml`, whereas before they would be replaced.
* `stack build` will now announce when sublibraries of a package are being built, in the same way executables, tests, benchmarks, and libraries are announced.
* `stack sdist` will now announce the destination of the generated tarball, regardless of whether or not it passed the sanity checks.
* The `--upgrade-cabal` option to `stack setup` has been deprecated. This feature no longer works with GHC 8.2 and later. Furthermore, the reason for this flag originally being implemented was drastically lessened once Stack started using the snapshot's `Cabal` library for custom setups. See: [#4070](https://github.com/commercialhaskell/stack/issues/4070).
* With the new namespaced template feature, `stack templates` is no longer able to meaningfully display a list of all templates available. Instead, the command will download and display a [help file](https://github.com/commercialhaskell/stack-templates/blob/master/STACK_HELP.md) with more information on how to discover templates. See: [#4039](https://github.com/commercialhaskell/stack/issues/4039)
* Build tools are now handled in a similar way to `cabal-install`. In particular, for legacy `build-tools` fields, we use a hard-coded list of build tools in place of looking up build tool packages in a tool map. This both brings Stack's behavior closer into line with `cabal-install`, avoids some bugs, and opens up some possible optimizations/laziness. See: [#4125](https://github.com/commercialhaskell/stack/issues/4125).
* Mustache templating is not applied to large files (over 50kb) to avoid performance degradation. See: [#4133](https://github.com/commercialhaskell/stack/issues/4133).
* `stack upload` signs the package by default, as documented. `--no-signature` turns the signing off. [#3739](https://github.com/commercialhaskell/stack/issues/3739)
* In case there is a network connectivity issue while trying to download a template, stack will check whether that template had been downloaded before. In that case, the cached version will be used. See [#3850](https://github.com/commercialhaskell/stack/issues/3739).

Other enhancements:

* On Windows before Windows 10, --color=never is the default on terminals that can support ANSI color codes in output only by emulation.
* On Windows, recognise a 'mintty' (false) terminal as a terminal, by default.
* `stack build` issues a warning when `base` is explicitly listed in `extra-deps` of `stack.yaml`.
* `stack build` suggests trying another GHC version should the build plan end up requiring an unattainable `base` version.
* A new subcommand, `run`, has been introduced to build and run a specified executable, similar to `cabal run`.
If no executable is provided as the first argument, it defaults to the first available executable in the project.
* `stack build` missing dependency suggestions (on failure to construct a valid build plan because of missing deps) are now printed with their latest cabal file revision hash. See [#4068](https://github.com/commercialhaskell/stack/pull/4068).
* Added a new `--tar-dir` option to `stack sdist`, which allows copying the resulting tarball to the specified directory.
* Introduced the `--interleaved-output` command line option and `build.interleaved-output` config value, which causes multiple concurrent builds to dump to stderr at the same time with a `packagename> ` prefix. See [#3225](https://github.com/commercialhaskell/stack/issues/3225).
* The default retry strategy has changed to exponential backoff. This should help with [#3510](https://github.com/commercialhaskell/stack/issues/3510).
* `stack new` now allows template names of the form `username/foo` to download from a user other than `commercialhaskell` on Github, and can be prefixed with the service `github:`, `gitlab:`, or `bitbucket:`. [#4039](https://github.com/commercialhaskell/stack/issues/4039)
* Switch to `githash` to include some unmerged bugfixes in `gitrev`.
* The suggestion to add `'allow-newer': true` now shows the path to the user config file where this flag should be put. [#3685](https://github.com/commercialhaskell/stack/issues/3685)
* `stack ghci` now asks which main target to load before doing the build, rather than after.
* Bump to hpack 0.29.0.
* With GHC 8.4 and later, Haddock is given the `--quickjump` flag.
* It is possible to specify the Hackage base URL to upload packages to, instead of the default of `https://hackage.haskell.org/`, by using the `hackage-base-url` configuration option.
* When using Nix, if a specific minor version of GHC is not requested, the latest minor version in the given major branch will be used automatically.
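[Editor's sketch] Two of the new configuration values mentioned in the enhancements above can be set in `~/.stack/config.yaml`. A hedged fragment (the mirror URL is a placeholder, not a real endpoint):

```yaml
# ~/.stack/config.yaml (sketch; key nesting inferred from the changelog)
build:
  interleaved-output: true                       # new build.interleaved-output value
hackage-base-url: https://hackage.example.org/   # custom upload target
```

Check the stack documentation for the authoritative key names before relying on this.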
Bug fixes:

* `stack ghci` now does not invalidate `.o` files on repeated runs, meaning any modules compiled with `-fobject-code` will be cached between ghci runs. See [#4038](https://github.com/commercialhaskell/stack/pull/4038).
* `~/.stack/config.yaml` and `stack.yaml` are now terminated by a newline.
* The previous release caused a regression where some `stderr` from the `ghc-pkg` command showed up in the terminal. This output is now silenced.
* A regression in recompilation checking introduced in v1.7.1 has been fixed. See [#4001](https://github.com/commercialhaskell/stack/issues/4001).
* `stack ghci` on a package with internal libraries was erroneously looking for a wrong package corresponding to the internal library and failing to load any module. This has now been fixed, and changes to the code in the library and the sublibrary are properly tracked. See [#3926](https://github.com/commercialhaskell/stack/issues/3926).
* For packages with internal libraries not depended upon, `stack build` used to fail the build process, since the internal library was not built but Stack still tried to register it. This is now fixed by always building internal libraries. See [#3996](https://github.com/commercialhaskell/stack/issues/3996).
* `--no-nix` was not respected under NixOS.
* Fix a regression which might use a lot of RAM. See [#4027](https://github.com/commercialhaskell/stack/issues/4027).
* Order of command-line arguments does not matter anymore. See [#3959](https://github.com/commercialhaskell/stack/issues/3959).
* When prompting users about saving their Hackage credentials on upload, flush to stdout before waiting for the response, so the prompt actually displays. Also fixes a similar issue with the ghci target selection prompt.
* If `cabal` is not on the PATH, running `stack solver` now prompts the user to run `stack install cabal-install`.
* `stack build` now succeeds in building packages which contain sublibraries which are dependencies of executables, tests, or benchmarks but not of the main library. See [#3787](https://github.com/commercialhaskell/stack/issues/3959).
* Sublibraries are now properly considered for coverage reports when the test suite depends on the internal library. Before, stack errored when trying to generate the coverage report; see [#4105](https://github.com/commercialhaskell/stack/issues/4105).
* Sublibraries are now added to the precompiled cache and recovered from there when the snapshot gets updated. Previously, updating the snapshot when there was a package with a sublibrary in the snapshot resulted in broken builds. This is now fixed; see [#4071](https://github.com/commercialhaskell/stack/issues/4071).
* [#4114] Stack now pretty-prints error messages with the proper `error` logging level instead of `warning`. This also fixes self-executing scripts not piping plan-construction errors from runhaskell to the terminal (issue #3942).
* Fix invalid "While building Setup.hs" output when Cabal calls fail. See: [#3934](https://github.com/commercialhaskell/stack/issues/3934)
* `stack upload` signs the package by default, as documented. `--no-signature` turns the signing off. [#3739](https://github.com/commercialhaskell/stack/issues/3739)

From monnier at iro.umontreal.ca Mon Sep 24 14:05:11 2018 From: monnier at iro.umontreal.ca (Stefan Monnier) Date: Mon, 24 Sep 2018 10:05:11 -0400 Subject: [Haskell-cafe] clonetype References: Message-ID:

> Does it make sense, and is it possible to have something like a "clonetype" > that creates a new type and derives all the instances of the parent type as > well?

Presumably, the reason why you create a newtype is because you want the type system to distinguish the two types.
Having automatic access to all the base type's classes may sometimes be what you want, but in other cases it hides too much of the distinction between the two types. Stefan

From harendra.kumar at gmail.com Mon Sep 24 14:06:08 2018 From: harendra.kumar at gmail.com (Harendra Kumar) Date: Mon, 24 Sep 2018 19:36:08 +0530 Subject: [Haskell-cafe] clonetype In-Reply-To: References: Message-ID:

On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus wrote: > The problem is that "All instances" is hard to pin point. We have open > world assumption, so instances can be added later (in the dependency tree). > Should they be cloned too? And even of you restrict to "instances visible > at clonetype definition", that's IMHO not a good idea either, as it's > implicit and volatile set (editing imports changes may change the set).

A clone type says "both the types are exactly the same in all semantics except that they cannot be used interchangeably"; it is just like "type" except that the types are treated as being different. Visible instances change for the clone type by editing imports in exactly the same way they change for the original type, so I do not see a problem there. However, the two types may diverge if we define more instances for either of them after cloning, and that may potentially be a source of confusion?

> Haskell's heavy type machinery exists so we can explicitly and exactly say > what we need or want.

Mortal programmers would love to have "conveniently" added to that list :-) -harendra

-------------- next part -------------- An HTML attachment was scrubbed...
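[Editor's sketch] To make the `Tagged` suggestion from earlier in the thread concrete, here is a minimal, self-contained sketch of the phantom-type trick. It inlines a simplified `Tagged` rather than depending on the `tagged` package, and the names `UserName`/`HostName` are illustrative only:

```haskell
-- A simplified stand-in for Data.Tagged.Tagged: the phantom `tag`
-- parameter distinguishes otherwise-identical representations.
newtype Tagged tag a = Tagged { unTagged :: a }
  deriving (Eq, Ord, Show)

data UserName  -- empty tag types; never inhabited
data HostName

type User = Tagged UserName String
type Host = Tagged HostName String

greet :: User -> String
greet (Tagged u) = "hello, " ++ u

main :: IO ()
main = do
  let user = Tagged "alice"       :: User
      host = Tagged "example.org" :: Host
  putStrLn (greet user)
  -- greet host  -- rejected at compile time: the phantom tags differ
  print (unTagged host)
```

Because `tag` never appears in a field, the derived instances put no constraint on it; the two synonyms share all of `Tagged`'s instances yet cannot be mixed up.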
URL:

From oleg.grenrus at iki.fi Mon Sep 24 19:42:29 2018 From: oleg.grenrus at iki.fi (Oleg Grenrus) Date: Mon, 24 Sep 2018 22:42:29 +0300 Subject: [Haskell-cafe] clonetype In-Reply-To: References: Message-ID: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi>

On 24.09.2018 17:06, Harendra Kumar wrote: > > > On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus > > wrote: > > The problem is that "All instances" is hard to pin point. We have > open world assumption, so instances can be added later (in the > dependency tree). Should they be cloned too? And even of you > restrict to "instances visible at clonetype definition", that's > IMHO not a good idea either, as it's implicit and volatile set > (editing imports changes may change the set). > > > A clone type says "both the types are exactly the same in all > semantics except that they cannot be used interchangeably", it is just > like "type" except that the types are treated as being different. The > way visible instances change for the original type by editing imports, > the same way they change for the clone type as well, I do not see a > problem there. However, the two types may diverge if we define more > instances for any of them after cloning and that may potentially be a > source of confusion?

If you want that, then GeneralizedNewtypeDeriving is the solution. It's not so convenient, as you have to list the instances you need, but on the flip side of the coin is the "explicitness" of the deriving clause. GHC will barf if you forget an import for an instance you want, or if you have an unused import. Often redundancy is your friend. Type annotations very often aren't necessary, but it's good practice to write them (e.g. for top-level definitions). So I'd say that not having `clonetype` is a feature.
> > > Mortal programmers would love to have "conveniently" added to that > list :-) >   > -harendra > - Oleg From harendra.kumar at gmail.com Mon Sep 24 20:35:37 2018 From: harendra.kumar at gmail.com (Harendra Kumar) Date: Tue, 25 Sep 2018 02:05:37 +0530 Subject: [Haskell-cafe] clonetype In-Reply-To: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> Message-ID: On Tue, 25 Sep 2018 at 01:12, Oleg Grenrus wrote: > On 24.09.2018 17:06, Harendra Kumar wrote: > > > > > > On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus > > wrote: > > > > The problem is that "All instances" is hard to pin point. We have > > open world assumption, so instances can be added later (in the > > dependency tree). Should they be cloned too? And even of you > > restrict to "instances visible at clonetype definition", that's > > IMHO not a good idea either, as it's implicit and volatile set > > (editing imports changes may change the set). > > > > > > A clone type says "both the types are exactly the same in all > > semantics except that they cannot be used interchangeably", it is just > > like "type" except that the types are treated as being different. The > > way visible instances change for the original type by editing imports, > > the same way they change for the clone type as well, I do not see a > > problem there. However, the two types may diverge if we define more > > instances for any of them after cloning and that may potentially be a > > source of confusion? > > If you want that, then the GeneralizedNewtypeDeriving is the solution. > It's not so convinient, as you have to list the instances you need, but > on the flip side of the coin is the "explicitness" of the deriving > clause. GHC will barf if you forget an import for an instance you want, > or if you have unused import. Often redundancy is your friend. Type > annotations very often aren't necessary, but it's good practice to write > them (e.g. for top-level definitions). 
So I'd say that not having `clonetype` is a feature.

That's where I started. I already use a newtype with GND for this, and it looks like this:

  newtype Count = Count Int64
      deriving ( Eq
               , Read
               , Show
               , Enum
               , Bounded
               , Num
               , Real
               , Integral
               , Ord
               )

The problem is that most programmers are lazy or hard pressed for time, and having to write a newtype with a big list of instances actually discourages the free use of newtypes for this case; they may just make a habit of letting it go. We can't just deny this and say that programmers must be disciplined. They will often take the path of least effort. So in practice I am not sure what is better: being explicit, or encouraging the use of distinct types and potentially avoiding bugs by doing so. What kind of actual problems/bugs may arise by not being explicit in this particular case? -harendra

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From erkokl at gmail.com Mon Sep 24 20:42:54 2018 From: erkokl at gmail.com (Levent Erkok) Date: Mon, 24 Sep 2018 13:42:54 -0700 Subject: [Haskell-cafe] clonetype In-Reply-To: References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> Message-ID:

If you're OK with a little Template Haskell and standalone deriving, then you can use the trick discussed here:

https://stackoverflow.com/questions/45113205/is-there-a-way-to-shorten-this-deriving-clause

Here's a concrete implementation in my case:

https://github.com/LeventErkok/sbv/blob/master/Data/SBV.hs#L407-L420

And a use-case:

https://github.com/LeventErkok/sbv/blob/master/Documentation/SBV/Examples/Queries/Enums.hs#L23-L27

Without that trick, the line would've looked almost like what you had to write with `Count`.

I've used this trick for quite some time now, and it's both cheap and quite effective. I agree that a directly supported `deriving` syntax would be nicer, but TH fits the bill well here.

-Levent.
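[Editor's sketch] The GND pattern in the thread can also be written with standalone deriving declarations, which is the shape a Template Haskell helper like the one linked above would generate. A minimal sketch (the `Count` type mirrors the example in the thread; the split between the inline and standalone clauses is purely illustrative):

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE StandaloneDeriving #-}

import Data.Int (Int64)

-- A distinct type with the same run-time representation as Int64.
newtype Count = Count Int64
  deriving (Eq, Ord, Show, Num)

-- The remaining instances as standalone deriving declarations;
-- a TH helper can emit this block from a list of class names.
deriving instance Enum Count
deriving instance Real Count
deriving instance Integral Count

main :: IO ()
main = do
  print (Count 41 + 1)                      -- Num instance via GND
  print (fromIntegral (Count 42) :: Int64)  -- Integral instance via GND
```

Either spelling keeps the instance set explicit, which is the "redundancy is your friend" property argued for earlier in the thread.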
On Mon, Sep 24, 2018 at 1:36 PM Harendra Kumar wrote: > > > On Tue, 25 Sep 2018 at 01:12, Oleg Grenrus wrote: > >> On 24.09.2018 17:06, Harendra Kumar wrote: >> > >> > >> > On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus > > > wrote: >> > >> > The problem is that "All instances" is hard to pin point. We have >> > open world assumption, so instances can be added later (in the >> > dependency tree). Should they be cloned too? And even of you >> > restrict to "instances visible at clonetype definition", that's >> > IMHO not a good idea either, as it's implicit and volatile set >> > (editing imports changes may change the set). >> > >> > >> > A clone type says "both the types are exactly the same in all >> > semantics except that they cannot be used interchangeably", it is just >> > like "type" except that the types are treated as being different. The >> > way visible instances change for the original type by editing imports, >> > the same way they change for the clone type as well, I do not see a >> > problem there. However, the two types may diverge if we define more >> > instances for any of them after cloning and that may potentially be a >> > source of confusion? >> >> If you want that, then the GeneralizedNewtypeDeriving is the solution. >> It's not so convinient, as you have to list the instances you need, but >> on the flip side of the coin is the "explicitness" of the deriving >> clause. GHC will barf if you forget an import for an instance you want, >> or if you have unused import. Often redundancy is your friend. Type >> annotations very often aren't necessary, but it's good practice to write >> them (e.g. for top-level definitions). So I'd say that not having >> `clonetype` is a feature. >> >> > That's where I started. 
I already use a newtype with GND for this, and it > looks like this: > > newtype Count = Count Int64 > deriving ( Eq > , Read > , Show > , Enum > , Bounded > , Num > , Real > , Integral > , Ord > ) > > The problem is that most programmers are lazy or hard pressed for time and > having to write a newtype with a big list of instances actually discourages > the use of newtypes freely for this case, they may just make it a habit to > let it go. We can't just deny this and say that programmers must be > disciplined. They will often try taking the path of least effort. So in > practice I am not sure what is better, being explicit or encouraging the > use of distinct types and potentially avoiding bugs by doing so. What kind > of actual problems/bugs may arise by not being explicit in this particular > case? > > -harendra > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From harendra.kumar at gmail.com Mon Sep 24 21:09:44 2018 From: harendra.kumar at gmail.com (Harendra Kumar) Date: Tue, 25 Sep 2018 02:39:44 +0530 Subject: [Haskell-cafe] clonetype In-Reply-To: References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> Message-ID: Since this is a basic use case, I was originally looking for a language solution that is extremely easy to use, intuitive and can be used by newcomers without having to learn the tricks. The TH solution solves the repetition problem, but not the initial inertia to use it, and we will have to use TH in almost all our programs. The deriving synonym extension that Li-yao mentioned earlier is perhaps a better solution than TH if implemented. 
-harendra On Tue, 25 Sep 2018 at 02:13, Levent Erkok wrote: > If you're OK with a little template Haskell and standalone-deriving, then > you can use the trick discussed here: > > > https://stackoverflow.com/questions/45113205/is-there-a-way-to-shorten-this-deriving-clause > > Here's a concrete implementation in my case: > > https://github.com/LeventErkok/sbv/blob/master/Data/SBV.hs#L407-L420 > > And a use-case: > > > https://github.com/LeventErkok/sbv/blob/master/Documentation/SBV/Examples/Queries/Enums.hs#L23-L27 > > Without that trick, the line would've looked like almost like what you had > to write with `Count`. > > I've used this trick for quite some time now, and it's both cheap and > quite effective. I agree that a directly supported `deriving` syntax would > be nicer, but TH fits the bill well here. > > -Levent. > > On Mon, Sep 24, 2018 at 1:36 PM Harendra Kumar > wrote: > >> >> >> On Tue, 25 Sep 2018 at 01:12, Oleg Grenrus wrote: >> >>> On 24.09.2018 17:06, Harendra Kumar wrote: >>> > >>> > >>> > On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus >> > > wrote: >>> > >>> > The problem is that "All instances" is hard to pin point. We have >>> > open world assumption, so instances can be added later (in the >>> > dependency tree). Should they be cloned too? And even of you >>> > restrict to "instances visible at clonetype definition", that's >>> > IMHO not a good idea either, as it's implicit and volatile set >>> > (editing imports changes may change the set). >>> > >>> > >>> > A clone type says "both the types are exactly the same in all >>> > semantics except that they cannot be used interchangeably", it is just >>> > like "type" except that the types are treated as being different. The >>> > way visible instances change for the original type by editing imports, >>> > the same way they change for the clone type as well, I do not see a >>> > problem there. 
However, the two types may diverge if we define more >>> > instances for any of them after cloning and that may potentially be a >>> > source of confusion? >>> >>> If you want that, then the GeneralizedNewtypeDeriving is the solution. >>> It's not so convinient, as you have to list the instances you need, but >>> on the flip side of the coin is the "explicitness" of the deriving >>> clause. GHC will barf if you forget an import for an instance you want, >>> or if you have unused import. Often redundancy is your friend. Type >>> annotations very often aren't necessary, but it's good practice to write >>> them (e.g. for top-level definitions). So I'd say that not having >>> `clonetype` is a feature. >>> >>> >> That's where I started. I already use a newtype with GND for this, and it >> looks like this: >> >> newtype Count = Count Int64 >> deriving ( Eq >> , Read >> , Show >> , Enum >> , Bounded >> , Num >> , Real >> , Integral >> , Ord >> ) >> >> The problem is that most programmers are lazy or hard pressed for time >> and having to write a newtype with a big list of instances actually >> discourages the use of newtypes freely for this case, they may just make it >> a habit to let it go. We can't just deny this and say that programmers must >> be disciplined. They will often try taking the path of least effort. So in >> practice I am not sure what is better, being explicit or encouraging the >> use of distinct types and potentially avoiding bugs by doing so. What kind >> of actual problems/bugs may arise by not being explicit in this particular >> case? >> >> -harendra >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From roehst at gmail.com Tue Sep 25 00:00:36 2018 From: roehst at gmail.com (Rodrigo Stevaux) Date: Mon, 24 Sep 2018 21:00:36 -0300 Subject: [Haskell-cafe] Is it possible to change the environment (reader) in applicative style? Message-ID: Hi, I found a paper from the Greats exactly about desugaring monads into applicatives: The type of >>= allows the second computation (f b) to depend on the result a of the first, whereas <*> does not. This is the essence of the difference between Monad and Applicative; Monad allows dependencies on previous results, whereas Applicative does not. "Desugaring Haskell’s do-Notation into Applicative Operations" by marlow, SPJ, kmett and mokhov From qdunkan at gmail.com Tue Sep 25 00:19:00 2018 From: qdunkan at gmail.com (Evan Laforge) Date: Mon, 24 Sep 2018 17:19:00 -0700 Subject: [Haskell-cafe] clonetype In-Reply-To: References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> Message-ID: For some reason I thought ConstraintKinds would let you do: type UsualStuff a = (Eq a, Read a, ...) newtype .. deriving (UsualStuff) No such luck apparently! On Mon, Sep 24, 2018 at 1:36 PM Harendra Kumar wrote: > > > > On Tue, 25 Sep 2018 at 01:12, Oleg Grenrus wrote: >> >> On 24.09.2018 17:06, Harendra Kumar wrote: >> > >> > >> > On Mon, 24 Sep 2018 at 18:17, Oleg Grenrus > > > wrote: >> > >> > The problem is that "All instances" is hard to pin point. We have >> > open world assumption, so instances can be added later (in the >> > dependency tree). Should they be cloned too? And even of you >> > restrict to "instances visible at clonetype definition", that's >> > IMHO not a good idea either, as it's implicit and volatile set >> > (editing imports changes may change the set). >> > >> > >> > A clone type says "both the types are exactly the same in all >> > semantics except that they cannot be used interchangeably", it is just >> > like "type" except that the types are treated as being different. 
The >> > way visible instances change for the original type by editing imports, >> > the same way they change for the clone type as well, I do not see a >> > problem there. However, the two types may diverge if we define more >> > instances for any of them after cloning and that may potentially be a >> > source of confusion? >> >> If you want that, then the GeneralizedNewtypeDeriving is the solution. >> It's not so convinient, as you have to list the instances you need, but >> on the flip side of the coin is the "explicitness" of the deriving >> clause. GHC will barf if you forget an import for an instance you want, >> or if you have unused import. Often redundancy is your friend. Type >> annotations very often aren't necessary, but it's good practice to write >> them (e.g. for top-level definitions). So I'd say that not having >> `clonetype` is a feature. >> > > That's where I started. I already use a newtype with GND for this, and it looks like this: > > newtype Count = Count Int64 > deriving ( Eq > , Read > , Show > , Enum > , Bounded > , Num > , Real > , Integral > , Ord > ) > > The problem is that most programmers are lazy or hard pressed for time and having to write a newtype with a big list of instances actually discourages the use of newtypes freely for this case, they may just make it a habit to let it go. We can't just deny this and say that programmers must be disciplined. They will often try taking the path of least effort. So in practice I am not sure what is better, being explicit or encouraging the use of distinct types and potentially avoiding bugs by doing so. What kind of actual problems/bugs may arise by not being explicit in this particular case? > > -harendra > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. 
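The Template Haskell trick referenced in the thread (the StackOverflow link on shortening deriving clauses) can be sketched roughly as follows. This is an illustrative sketch only, not code from the thread or from sbv; the names `DeriveCommon`, `commonClasses` and `deriveCommon` are made up here:

```haskell
{-# LANGUAGE TemplateHaskell #-}
-- Splice sites additionally need StandaloneDeriving and
-- GeneralizedNewtypeDeriving enabled.
module DeriveCommon (deriveCommon) where

import Language.Haskell.TH

-- The "usual" class list, written down once.
commonClasses :: [Name]
commonClasses =
  [ ''Eq, ''Ord, ''Read, ''Show, ''Enum, ''Bounded
  , ''Num, ''Real, ''Integral
  ]

-- Generate one "deriving instance C T" declaration per class;
-- with GND enabled, GHC derives the non-stock classes via the
-- newtype's underlying type.
deriveCommon :: Name -> Q [Dec]
deriveCommon ty =
  mapM (\cls -> standaloneDerivD (cxt []) (appT (conT cls) (conT ty)))
       commonClasses
```

With something like that in scope, the nine-line deriving clause on `Count` collapses at each use site to `newtype Count = Count Int64` followed by the splice `$(deriveCommon ''Count)`.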
From tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk Wed Sep 26 10:27:33 2018 From: tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk (Tom Ellis) Date: Wed, 26 Sep 2018 11:27:33 +0100 Subject: [Haskell-cafe] Where is "minimumsBy"? Message-ID: <20180926102733.be2wjtiz5ymuqrdo@weber> Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy but there are many cases where that's quite unhelpful. Actually what we want is more like minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] There can be many distinct minimizers. For example when I want to get the collection of the youngest people from [(Age, Person)] I want minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] to return [(12, alice), (12, cho)] Does "minimumsBy" exist somewhere reasonably standard? Hoogle doesn't throw up anything obvious https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D Thanks, Tom From wolfgang-it at jeltsch.info Wed Sep 26 12:41:29 2018 From: wolfgang-it at jeltsch.info (Wolfgang Jeltsch) Date: Wed, 26 Sep 2018 15:41:29 +0300 Subject: [Haskell-cafe] clonetype In-Reply-To: References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> Message-ID: <1537965689.28833.217.camel@jeltsch.info> On Tuesday, 25.09.2018 at 02:05 +0530, Harendra Kumar wrote: > That's where I started. I already use a newtype with GND for this, and > it looks like this: > > newtype Count = Count Int64 >     deriving ( Eq >              , Read >              , Show >              , Enum >              , Bounded >              , Num >              , Real >              , Integral >              , Ord >              ) > > The problem is that most programmers are lazy or hard pressed for time > and having to write a newtype with a big list of instances actually > discourages the use of newtypes freely for this case, they may just > make it a habit to let it go.
We can't just deny this and say that > programmers must be disciplined. They will often try taking the path > of least effort. I think that the time it takes to come up with and write down such explicit lists is usually small compared to the time it takes to do all the other development. And once you have made the instantiation lists explicit, you will probably save time in the future, because bugs will be detected automatically more often. The latter point is something that is often overlooked: people are under time pressure and strive for quick solutions but spend more time in the long run this way. All the best, Wolfgang -------------- next part -------------- An HTML attachment was scrubbed... URL: From allbery.b at gmail.com Wed Sep 26 14:13:21 2018 From: allbery.b at gmail.com (Brandon Allbery) Date: Wed, 26 Sep 2018 10:13:21 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: <20180926102733.be2wjtiz5ymuqrdo@weber> References: <20180926102733.be2wjtiz5ymuqrdo@weber> Message-ID: Not exactly that, but you can use groupBy fst . sort, then the head of the result list is your "minimumsBy" result. On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a > > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > but there are many cases where that's quite unhelpful. Actually what we > want is more like > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > There can be many distinct minimizers. For example when I want to get the > collection of the youngest people from [(Age, Person)] I want > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] > > to return > > [(12, alice), (12, cho)] > > Does "minimumsBy" exist somewhere reasonably standard?
Hoogle doesn't > throw > up anything obvious > > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D > > Thanks, > > Tom > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -- brandon s allbery kf8nh allbery.b at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk Wed Sep 26 14:22:33 2018 From: tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk (Tom Ellis) Date: Wed, 26 Sep 2018 15:22:33 +0100 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: References: <20180926102733.be2wjtiz5ymuqrdo@weber> Message-ID: <20180926142233.5um4vlffbdpy7ood@weber> Hopefully laziness makes the complexity work out fine. Nonetheless I don't like relying on laziness for the correct complexity and it would still be nice to have an explicit version. On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: > Not exactly that, but you can use groupBy fst . sort, then the head of the > result list is your "minimumsBy" result. > > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a > > > > > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > > > but there are many cases where that's quite unhelpful. Actually what we > > want is more like > > > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > > > There can be many distinct minimizers. 
For example when I want to get the > > collection of the youngest people from [(Age, Person)] I want > > > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] > > > > to return > > > > [(12, alice), (12, cho)] > > > > Does "minimumsBy" exist somewhere reasonably standard? Hoogle doesn't > > throw > > up anything obvious > > > > > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D From david.feuer at gmail.com Wed Sep 26 14:30:14 2018 From: david.feuer at gmail.com (David Feuer) Date: Wed, 26 Sep 2018 10:30:14 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: <20180926142233.5um4vlffbdpy7ood@weber> References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: Laziness does not make the complexity work out fine. Sorting is still O(n log n), which isn't needed here. On Wed, Sep 26, 2018, 10:22 AM Tom Ellis < tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > Hopefully laziness makes the complexity work out fine. Nonetheless I don't > like relying on laziness for the correct complexity and it would still be > nice to have an explicit version. > > On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: > > Not exactly that, but you can use groupBy fst . sort, then the head of > the > > result list is your "minimumsBy" result. > > > > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < > > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a > > > > > > > > > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > > > > > but there are many cases where that's quite unhelpful. Actually what > we > > > want is more like > > > > > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > > > > > There can be many distinct minimizers. 
For example when I want to get > the > > > collection of the youngest people from [(Age, Person)] I want > > > > > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, > cho)] > > > > > > to return > > > > > > [(12, alice), (12, cho)] > > > > > > Does "minimumsBy" exist somewhere reasonably standard? Hoogle doesn't > > > throw > > > up anything obvious > > > > > > > > > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bob at redivi.com Wed Sep 26 14:38:42 2018 From: bob at redivi.com (Bob Ippolito) Date: Wed, 26 Sep 2018 10:38:42 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: Haskell’s sort algorithm is linear complexity when only evaluating the front of the list. See also https://ro-che.info/articles/2016-04-02-descending-sort-haskell which includes some measurements. On Wed, Sep 26, 2018 at 10:30 David Feuer wrote: > Laziness does not make the complexity work out fine. Sorting is still O(n > log n), which isn't needed here. > > On Wed, Sep 26, 2018, 10:22 AM Tom Ellis < > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: > >> Hopefully laziness makes the complexity work out fine. Nonetheless I >> don't >> like relying on laziness for the correct complexity and it would still be >> nice to have an explicit version. >> >> On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: >> > Not exactly that, but you can use groupBy fst . sort, then the head of >> the >> > result list is your "minimumsBy" result. 
>> > >> > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < >> > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: >> > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a >> > > >> > > >> > > >> https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy >> > > >> > > but there are many cases where that's quite unhelpful. Actually what >> we >> > > want is more like >> > > >> > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] >> > > >> > > There can be many distinct minimizers. For example when I want to >> get the >> > > collection of the youngest people from [(Age, Person)] I want >> > > >> > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, >> cho)] >> > > >> > > to return >> > > >> > > [(12, alice), (12, cho)] >> > > >> > > Does "minimumsBy" exist somewhere reasonably standard? Hoogle doesn't >> > > throw >> > > up anything obvious >> > > >> > > >> > > >> https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From harendra.kumar at gmail.com Wed Sep 26 14:49:50 2018 From: harendra.kumar at gmail.com (Harendra Kumar) Date: Wed, 26 Sep 2018 20:19:50 +0530 Subject: [Haskell-cafe] clonetype In-Reply-To: <1537965689.28833.217.camel@jeltsch.info> References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> <1537965689.28833.217.camel@jeltsch.info> Message-ID: On Wed, 26 Sep 2018 at 18:11, Wolfgang Jeltsch wrote: > Am Dienstag, den 25.09.2018, 02:05 +0530 schrieb Harendra Kumar: > > That's where I started. I already use a newtype with GND for this, and it > looks like this: > > *newtype* Count = Count Int64 > *deriving* ( Eq > , Read > , Show > , Enum > , Bounded > , Num > , Real > , Integral > , Ord > ) > > The problem is that most programmers are lazy or hard pressed for time and > having to write a *newtype* with a big list of instances actually > discourages the use of *newtype*s freely for this case, they may just > make it a habit to let it go. We can't just deny this and say that > programmers must be disciplined. They will often try taking the path of > least effort. > > > I think that the time it takes to come up with and write down such > explicit lists is usually small compared to the time it takes to do all the > other development. And once you have made the instantiation lists explicit, > you will probably save time in the future, because bugs will detected > automatically more often. The latter point is something that is often > overlooked: people are under time pressure and strive for quick solutions > but spend more time in the long run this way. > Two quick thoughts: 1) Nobody has pointed out what kind of bugs (with specific examples) will arise if we have something like clonetype. Are those bugs more dangerous or will consume more time compared to what we are trying to avoid in the first place? I am just trying to learn more about it, not claiming that this is better. 
2) It is a real, unsolvable problem that people take shortcuts when available, people will be people; this is also one of the reasons why Haskell is not so successful, other languages are easy in the short run. If we accept that this is a fact of life, we have two options in general: (1) provide a safer, shorter route so that we automatically choose that one, or (2) close the unsafe shorter route to force ourselves to choose the safe one. I was trying to explore if there is a solution along the lines of the first option. The second option means that we should not allow two arguments of the same type in a function, forcing them to always make a newtype, perhaps a much more draconian solution and not worth the pain. -harendra -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.feuer at gmail.com Wed Sep 26 14:51:19 2018 From: david.feuer at gmail.com (David Feuer) Date: Wed, 26 Sep 2018 10:51:19 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: Ah, right... Sorry. On Wed, Sep 26, 2018, 10:38 AM Bob Ippolito wrote: > Haskell’s sort algorithm is linear complexity when only evaluating the > front of the list. See also > https://ro-che.info/articles/2016-04-02-descending-sort-haskell which > includes some measurements. > > On Wed, Sep 26, 2018 at 10:30 David Feuer wrote: > >> Laziness does not make the complexity work out fine. Sorting is still O(n >> log n), which isn't needed here. >> >> On Wed, Sep 26, 2018, 10:22 AM Tom Ellis < >> tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: >> >>> Hopefully laziness makes the complexity work out fine. Nonetheless I >>> don't >>> like relying on laziness for the correct complexity and it would still be >>> nice to have an explicit version. >>> >>> On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: >>> > Not exactly that, but you can use groupBy fst .
sort, then the head of >>> the >>> > result list is your "minimumsBy" result. >>> > >>> > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < >>> > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk> wrote: >>> > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a >>> > > >>> > > >>> > > >>> https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy >>> > > >>> > > but there are many cases where that's quite unhelpful. Actually >>> what we >>> > > want is more like >>> > > >>> > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] >>> > > >>> > > There can be many distinct minimizers. For example when I want to >>> get the >>> > > collection of the youngest people from [(Age, Person)] I want >>> > > >>> > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, >>> cho)] >>> > > >>> > > to return >>> > > >>> > > [(12, alice), (12, cho)] >>> > > >>> > > Does "minimumsBy" exist somewhere reasonably standard? Hoogle >>> doesn't >>> > > throw >>> > > up anything obvious >>> > > >>> > > >>> > > >>> https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D >>> _______________________________________________ >>> Haskell-Cafe mailing list >>> To (un)subscribe, modify options or view archives go to: >>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >>> Only members subscribed via the mailman list are allowed to post. >> >> _______________________________________________ >> Haskell-Cafe mailing list >> To (un)subscribe, modify options or view archives go to: >> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe >> Only members subscribed via the mailman list are allowed to post. > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From konn.jinro at gmail.com Wed Sep 26 14:52:02 2018 From: konn.jinro at gmail.com (Hiromi ISHII) Date: Wed, 26 Sep 2018 23:52:02 +0900 Subject: [Haskell-cafe] Where is "minimumsBy"? 
In-Reply-To: <20180926102733.be2wjtiz5ymuqrdo@weber> References: <20180926102733.be2wjtiz5ymuqrdo@weber> Message-ID: Not just a one-shot function, but you can do it with monoids:

```
import Data.Semigroup (Option(..), Semigroup(..))
import Data.DList

maximumsOn :: Ord b => (a -> b) -> [a] -> Maybe [a]
maximumsOn f = fmap (toList . elts) . getOption
             . foldMap (\a -> Option $ Just $ MaxAll (f a) $ pure a)

-- You don't need `Option`s since GHC 8.4
data MaxAll a b = MaxAll { weight :: a, elts :: DList b }

instance Ord a => Semigroup (MaxAll a b) where
  MaxAll l ls <> MaxAll r rs =
    case compare l r of
      EQ -> MaxAll l (ls <> rs)
      LT -> MaxAll r rs
      GT -> MaxAll l ls
```

> On 2018/09/26 19:27, Tom Ellis wrote: > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> t a -> a > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > but there are many cases where that's quite unhelpful. Actually what we > want is more like > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > There can be many distinct minimizers. For example when I want to get the > collection of the youngest people from [(Age, Person)] I want > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] > > to return > > [(12, alice), (12, cho)] > > Does "minimumsBy" exist somewhere reasonably standard? Hoogle doesn't throw > up anything obvious > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D > > Thanks, > > Tom > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. ----- Hiromi Ishii --------------------------- konn.jinro at gmail.com Third-year doctoral student, Mathematics, Graduate School of Pure and Applied Sciences, University of Tsukuba ---------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 488 bytes Desc: Message signed with OpenPGP URL: From ietf-dane at dukhovni.org Wed Sep 26 15:03:53 2018 From: ietf-dane at dukhovni.org (Viktor Dukhovni) Date: Wed, 26 Sep 2018 11:03:53 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: <20180926102733.be2wjtiz5ymuqrdo@weber> References: <20180926102733.be2wjtiz5ymuqrdo@weber> Message-ID: <34579E7D-BA3A-47A1-B25D-21FA21BB1912@dukhovni.org> > On Sep 26, 2018, at 6:27 AM, Tom Ellis wrote: > > Actually what we > want is more like > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > There can be many distinct minimizers. For example when I want to get the > collection of the youngest people from [(Age, Person)] I want > > minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] > > to return > > [(12, alice), (12, cho)] It is a rather elementary function:

    import Data.Foldable
    import Data.Ord

    -- Stable version that keeps the input order for elements that are
    -- equal.  If this were to be a library function, I'd drop the
    -- 'reverse' post-processing step, and leave the choice of stability
    -- to the caller.
    --
    minimumsBy :: Foldable t => (a -> a -> Ordering) -> t a -> [a]
    minimumsBy cmp xs = reverse $ foldl' acc [] xs
      where
        acc []         x = [x]
        acc mins@(m:_) x = case cmp m x of
            LT -> mins
            EQ -> x:mins
            GT -> [x]

-- Viktor. From david.feuer at gmail.com Wed Sep 26 16:24:15 2018 From: david.feuer at gmail.com (David Feuer) Date: Wed, 26 Sep 2018 12:24:15 -0400 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: <34579E7D-BA3A-47A1-B25D-21FA21BB1912@dukhovni.org> References: <20180926102733.be2wjtiz5ymuqrdo@weber> <34579E7D-BA3A-47A1-B25D-21FA21BB1912@dukhovni.org> Message-ID: That's exactly the approach I was thinking of. Leaving off the `reverse` saves some time in cases where it's not required. Of course, there's also a perfectly reasonable version using foldr', in case the data structure leans the other way.
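One possible shape for that no-`reverse` variant, accumulating from the right instead (an illustrative sketch, not code from the thread; `minimumsBy'` is a made-up name):

```haskell
import Data.Foldable (foldr')
import Data.Function (on)

-- Folding from the right keeps the surviving minimizers in input
-- order, so no final 'reverse' is needed.
minimumsBy' :: Foldable t => (a -> a -> Ordering) -> t a -> [a]
minimumsBy' cmp = foldr' acc []
  where
    acc x []         = [x]
    acc x mins@(m:_) = case cmp x m of
        LT -> [x]
        EQ -> x : mins
        GT -> mins
```

For instance, minimumsBy' (compare `on` fst) [(12, "alice"), (15, "balaji"), (12, "cho")] evaluates to [(12, "alice"), (12, "cho")], with the stable input order preserved.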
One variation or another of this approach should be a decent constant factor faster than partially sorting the list. On Wed, Sep 26, 2018 at 11:03 AM, Viktor Dukhovni wrote: >> On Sep 26, 2018, at 6:27 AM, Tom Ellis wrote: >> >> Actually what we >> want is more like >> >> minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] >> >> There can be many distinct minimizers. For example when I want to get the >> collection of the youngest people from [(Age, Person)] I want >> >> minimumsBy (compare `on` fst) [(12, alice), (15, balaji), (12, cho)] >> >> to return >> >> [(12, alice), (12, cho)] > > It is a rather elementary function: > > import Data.Foldable > import Data.Ord > > -- Stable version that keeps the input order for elements that are > -- equal. If this were to be a library function, I'd drop the > -- 'reverse' post-processing step, and leave the choice of stability > -- to the caller. > -- > minimumsBy :: Foldable t => (a -> a -> Ordering) -> t a -> [a] > minimumsBy cmp xs = reverse $ foldl' acc [] xs > where > acc [] x = [x] > acc mins@(m:_) x = case cmp m x of > LT -> mins > EQ -> x:mins > GT -> [x] > > -- > Viktor. > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. From svenpanne at gmail.com Wed Sep 26 18:17:50 2018 From: svenpanne at gmail.com (Sven Panne) Date: Wed, 26 Sep 2018 20:17:50 +0200 Subject: [Haskell-cafe] clonetype In-Reply-To: References: <87f90d3d-061a-d16c-be90-3c969c249b4a@iki.fi> <1537965689.28833.217.camel@jeltsch.info> Message-ID: On Wed, 26 Sep 2018 at 16:50, Harendra Kumar < harendra.kumar at gmail.com> wrote: > 1) Nobody has pointed out what kind of bugs (with specific examples) will > arise if we have something like clonetype.
Are those bugs more dangerous or > will consume more time compared to what we are trying to avoid in the first > place? I am just trying to learn more about it, not claiming that this is > better. > I think the main danger here is that it is totally unclear what actually gets derived. Is there a written specification of what actually should be derived? How would such an extension interact with other already existing extensions and separate compilation/instances added at a later time? If this can't be specified exactly, concisely and intuitively, you *will* have a maintenance nightmare, just like with all implicit things. > > 2) It is a real unsolvable problem that people take shortcuts when > available, > No, this is not unsolvable. People, especially newcomers to writing SW, must be educated, otherwise they will have to repeat the mistakes already made by lots of other people (a.k.a. "learning the hard way"): * What initially looks like a good cunning idea and/or like a shortcut will almost always turn out to be a nightmare later during maintenance and debugging. * Explicit is better than implicit. Note that this doesn't necessarily mean that you have to repeat yourself. * Only a tiny amount (I think I've read 5-10% several times) of time is actually spent programming things, the rest is spent understanding the problem, reading code (from other people or often even worse: your former past ;-), debugging and extending existing SW. Trying to optimize for the tiny fraction doesn't look like a good idea. This manifests in the mantra: "A good programming language doesn't make it easy to write correct SW, it makes it hard to write incorrect SW." > people will be people; this is also one of the reasons why Haskell is not > so successful, other languages are easy in the short run. If we accept that > this a fact of life, [...] > This shouldn't easily be accepted, quite the opposite. 
A lot of users of "easy" and "concise" programming languages have learned the hard way that their beloved language doesn't scale at all, leading to the development of TypeScript (extending JavaScript), mypy (extending Python), Hack (extending PHP), etc. What can be accepted as a fact of life is that there is often a tempting short route which is totally fine for throw-away scripts, quick hacks, etc., and there is a longer route, investing into the future. You have to choose... Coming back to the problem at hand: Even if we find a way to factor out the deriving-clauses to stay explicit, e.g. via the deriving synonyms proposal, I am not so sure if this is a good idea. How can you be sure when changing such a synonym that *all* affected types should really be changed? This would again be a maintenance nightmare. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vandijk.roel at gmail.com Thu Sep 27 12:45:42 2018 From: vandijk.roel at gmail.com (Roel van Dijk) Date: Thu, 27 Sep 2018 14:45:42 +0200 Subject: [Haskell-cafe] Suppress re-exported module's docs ... In-Reply-To: References: Message-ID: I believe the following is a common pattern: module A ( module Exports ) where import B as Exports import C as Exports Op vr 21 sep. 2018 13:15 schreef aditya siram : > Is there any way to have Haddock ignore a re-exported module without > hiding it? eg. here all of `B`'s exports/docs show up in `A`'s Haddock page > and I don't want to see them there: > > module A > ( > ... > module B > ) > import B > ... > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From trupill at gmail.com Thu Sep 27 18:04:02 2018 From: trupill at gmail.com (Alejandro Serrano Mena) Date: Thu, 27 Sep 2018 20:04:02 +0200 Subject: [Haskell-cafe] Status of Cloud Haskell Message-ID: Dear Café, I was wondering what is the current status of Cloud Haskell. Their website only mentions things until 2016, but I see that the 'distributed-process' library is itself well-maintained. Is Cloud Haskell still the way to go for distributed processes in Haskell? Regards, Alejandro -------------- next part -------------- An HTML attachment was scrubbed... URL: From c.sternagel at gmail.com Fri Sep 28 05:16:07 2018 From: c.sternagel at gmail.com (Christian Sternagel) Date: Fri, 28 Sep 2018 14:16:07 +0900 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: That is interesting. Is anybody aware of a more detailed justification of how lazy evaluation makes this happen? - chris On 09/26/2018 11:38 PM, Bob Ippolito wrote: > Haskell’s sort algorithm is linear complexity when only evaluating the > front of the list. See also  > https://ro-che.info/articles/2016-04-02-descending-sort-haskell which > includes some measurements.  > > On Wed, Sep 26, 2018 at 10:30 David Feuer > wrote: > > Laziness does not make the complexity work out fine. Sorting is > still O(n log n), which isn't needed here. > > On Wed, Sep 26, 2018, 10:22 AM Tom Ellis > > wrote: > > Hopefully laziness makes the complexity work out fine.  > Nonetheless I don't > like relying on laziness for the correct complexity and it would > still be > nice to have an explicit version. > > On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: > > Not exactly that, but you can use groupBy fst . sort, then the > head of the > > result list is your "minimumsBy" result.
> > > > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < > > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk > > wrote: > > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> > t a -> a > > > > > > > > > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > > > > > but there are many cases where that's quite unhelpful.  > Actually what we > > > want is more like > > > > > >     minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > > > > > There can be many distinct minimizers.  For example when I > want to get the > > > collection of the youngest people from [(Age, Person)] I want > > > > > >     minimumsBy (compare `on` fst) [(12, alice), (15, > balaji), (12, cho)] > > > > > > to return > > > > > >     [(12, alice), (12, cho)] > > > > > > Does "minimumsBy" exist somewhere reasonably standard?  > Hoogle doesn't > > > throw > > > up anything obvious > > > > > > > > > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. > > > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. 
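[Editorial aside: Brandon Allbery's groupBy-and-sort suggestion from the quoted thread above can be spelled out as follows. A sketch only — `minimumsBySort` is a made-up name for illustration, and the equivalence test on the comparator is my choice of grouping predicate.]

```haskell
import Data.List (groupBy, sortBy)

-- Sketch of the sort-based approach: sort with the comparator, group
-- runs of EQ-comparing elements, and take the first group. sortBy is
-- stable, so tied elements keep their input order. Thanks to the
-- laziness of GHC's merge sort (discussed elsewhere in the thread),
-- little beyond the leading run of the sorted list is ever demanded.
minimumsBySort :: (a -> a -> Ordering) -> [a] -> [a]
minimumsBySort _   [] = []
minimumsBySort cmp xs =
  head (groupBy (\a b -> cmp a b == EQ) (sortBy cmp xs))
```

With `minimumsBySort (compare `on` fst)` on the running example this again returns `[(12, alice), (12, cho)]`, at the cost of a sort rather than a single fold.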
> From frank at fstaals.net Fri Sep 28 07:30:53 2018 From: frank at fstaals.net (Frank Staals) Date: Fri, 28 Sep 2018 09:30:53 +0200 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: (Christian Sternagel's message of "Fri, 28 Sep 2018 14:16:07 +0900") References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: <85y3bm9iz6.fsf@UU.FStaals.net> Christian Sternagel writes: > That is interesting. Is anybody aware of a more detailed justification > of how lazy evaluation makes this happen? - chris There might be a more formal argument written down somewhere, but the gist of it should be: Think of the recursion tree of merge sort; leaves correspond to singleton lists, internal nodes correspond to merges of two sorted lists. To determine the minimum element of the merged result you need to consider only the first elements of both these lists. Hence, every node can be charged at most O(1) times. Since the tree has size O(n) the running time is linear. -- - Frank From bob at redivi.com Fri Sep 28 08:23:40 2018 From: bob at redivi.com (Bob Ippolito) Date: Fri, 28 Sep 2018 01:23:40 -0700 Subject: [Haskell-cafe] Where is "minimumsBy"? In-Reply-To: References: <20180926102733.be2wjtiz5ymuqrdo@weber> <20180926142233.5um4vlffbdpy7ood@weber> Message-ID: You can read a recent implementation of sortBy at https://github.com/ghc/ghc/blob/ghc-8.6.1-release/libraries/base/Data/OldList.hs#L943-L970 It often helps to work with a concrete example, such as: head (sort [5, 4, 6, 3, 7, 1]) Expanding that a bit we get: head (mergeAll (sequences [5, 4, 6, 3, 7, 1])) sequences is linear time, it breaks the list up into monotonically increasing sublists (up to n/2 of them) with pairwise comparisons. It doesn't all get evaluated right away, but nothing "interesting" is happening there so let's show it fully evaluated. 
head (mergeAll [[4, 5], [3, 6], [1, 7]]) Expanding that we get head (mergeAll (mergePairs [[4, 5], [3, 6], [1, 7]])) head (mergeAll (merge [4, 5] [3, 6] : [1, 7] : [])) head (mergeAll (mergePairs (merge [4, 5] [3, 6] : [1, 7] : []))) head (mergeAll (merge (merge [4, 5] [3, 6]) [1, 7] : [])) head (merge (merge [4, 5] [3, 6]) [1, 7]) head (merge (3 : merge [4, 5] [6]) [1, 7]) head (1 : merge (3 : merge [4, 5] [6]) [7]) 1 This phase is linear time in the worst case: we only compare the first element of each sublist once to find the least element. Having a constant number of phases that are linear (two in this case) is still linear. It would be linearithmic time if we were to fully evaluate the whole sort, but laziness gets to leave a lot of that work unevaluated. -bob On Thu, Sep 27, 2018 at 10:16 PM Christian Sternagel wrote: > That is interesting. Is anybody aware of a more detailed justification > of how lazy evaluation makes this happen? - chris > > On 09/26/2018 11:38 PM, Bob Ippolito wrote: > > Haskell’s sort algorithm is linear complexity when only evaluating the > > front of the list. See also > > https://ro-che.info/articles/2016-04-02-descending-sort-haskell which > > includes some measurements. > > > > On Wed, Sep 26, 2018 at 10:30 David Feuer > > wrote: > > > > Laziness does not make the complexity work out fine. Sorting is > > still O(n log n), which isn't needed here. > > > > On Wed, Sep 26, 2018, 10:22 AM Tom Ellis > > > > wrote: > > > > Hopefully laziness makes the complexity work out fine. > > Nonetheless I don't > > like relying on laziness for the correct complexity and it would > > still be > > nice to have an explicit version. > > > > On Wed, Sep 26, 2018 at 10:13:21AM -0400, Brandon Allbery wrote: > > > Not exactly that, but you can use groupBy fst . sort, then the > > head of the > > > result list is your "minimumsBy" result.
> > > > > > On Wed, Sep 26, 2018 at 6:28 AM Tom Ellis < > > > tom-lists-haskell-cafe-2017 at jaguarpaw.co.uk > > > wrote: > > > > Data.List.minimumBy :: Foldable t => (a -> a -> Ordering) -> > > t a -> a > > > > > > > > > > > > > > > https://www.stackage.org/haddock/lts-12.1/base-4.11.1.0/Data-List.html#v:minimumBy > > > > > > > > but there are many cases where that's quite unhelpful. > > Actually what we > > > > want is more like > > > > > > > > minimumsBy :: ... => (a -> a -> Ordering) -> t a -> [a] > > > > > > > > There can be many distinct minimizers. For example when I > > want to get the > > > > collection of the youngest people from [(Age, Person)] I want > > > > > > > > minimumsBy (compare `on` fst) [(12, alice), (15, > > balaji), (12, cho)] > > > > > > > > to return > > > > > > > > [(12, alice), (12, cho)] > > > > > > > > Does "minimumsBy" exist somewhere reasonably standard? > > Hoogle doesn't > > > > throw > > > > up anything obvious > > > > > > > > > > > > > > > https://www.stackage.org/lts-12.1/hoogle?q=%28a+-%3E+a+-%3E+Ordering%29+-%3E+t+a+-%3E+%5Ba%5D > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > > > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. > > > > > > > > _______________________________________________ > > Haskell-Cafe mailing list > > To (un)subscribe, modify options or view archives go to: > > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > > Only members subscribed via the mailman list are allowed to post. 
> > > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post. -------------- next part -------------- An HTML attachment was scrubbed... URL: From facundominguez at gmail.com Fri Sep 28 16:15:54 2018 From: facundominguez at gmail.com (=?UTF-8?Q?Facundo_Dom=C3=ADnguez?=) Date: Fri, 28 Sep 2018 13:15:54 -0300 Subject: [Haskell-cafe] Status of Cloud Haskell In-Reply-To: References: Message-ID: Hello Alejandro, distributed-process has the minimum maintenance necessary to keep it running since we are not actively using it in Tweag's projects. > It Cloud Haskell still the way to go for distributed processes in Haskell? I'm sure there are many opinions on how networking should be done. But as far as I'm aware, distributed-process hasn't been superseded by other technologies yet. Best, Facundo On Thu, Sep 27, 2018 at 3:05 PM Alejandro Serrano Mena wrote: > > Dear Café, > I was wondering what is the current status of Cloud Haskell. Their website only mentions things until 2016, but I see that the 'distributed-process' library is itself well-maintained. It Cloud Haskell still the way to go for distributed processes in Haskell? > > Regards, > Alejandro > _______________________________________________ > Haskell-Cafe mailing list > To (un)subscribe, modify options or view archives go to: > http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe > Only members subscribed via the mailman list are allowed to post.
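[Editorial coda, returning to the head-of-sort laziness thread earlier in this digest: the claim that `head . sortBy` forces only linearly many comparisons can be checked empirically. The sketch below uses an `unsafePerformIO`-based counting comparator purely as a measurement probe — a hack for illustration, not a recommended pattern — and `countedComparisons` is a made-up name.]

```haskell
import Control.Exception (evaluate)
import Data.IORef (modifyIORef', newIORef, readIORef)
import Data.List (sortBy)
import System.IO.Unsafe (unsafePerformIO)

-- Count how many comparisons 'consume . sortBy cmp' actually forces.
-- The comparator bumps an IORef via unsafePerformIO on every call,
-- which is reliable enough as a probe when run interpreted; the exact
-- counts are indicative, not guaranteed, under optimization.
countedComparisons :: ([Int] -> Int) -> [Int] -> IO Int
countedComparisons consume xs = do
  ref <- newIORef (0 :: Int)
  let cmp a b = unsafePerformIO (modifyIORef' ref (+ 1)) `seq` compare a b
  _ <- evaluate (consume (sortBy cmp xs))
  readIORef ref
```

Comparing `countedComparisons head xs` against `countedComparisons last xs` (the latter forces the whole sorted list) on a scrambled 1000-element list shows head-of-sort performing markedly fewer comparisons than the full sort, matching the roughly 2n-versus-n-log-n analysis above.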