[Haskell-cafe] What other prelude to cleanly convert text-files into json
Bram Neijt
bneijt at gmail.com
Sun Mar 1 16:47:22 UTC 2015
Thank you all for the responses.
Qualified imports in a script of less than a page make the code less
readable. That is why I want to import specific subsets of functions
rather than prefix everything with a qualified module name.
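To illustrate what I mean, here are two throwaway variants (just a
sketch using Data.Text, not taken from my actual script):

-- Variant 1: qualified import; every call carries the T. prefix
module Shout1 where

import qualified Data.Text as T

shout :: T.Text -> T.Text
shout = T.toUpper . T.strip

-- Variant 2: explicit import list; only the names I actually use
module Shout2 where

import Data.Text (Text, toUpper, strip)

shout :: Text -> Text
shout = toUpper . strip

For a handful of imports the second style reads better to me, which is
why the script below uses explicit import lists.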
I've taken the code from Travis (cleaned up some of the useless
pack/unpack calls), decided to use "NoImplicitPrelude", and ended up with this:
#!/usr/bin/runghc
{-# LANGUAGE NoImplicitPrelude, OverloadedStrings #-}

import Prelude (Show, IO, (++), print, String, filter)
import Data.List (isSuffixOf)
import Data.Text hiding (isSuffixOf, filter)
import Data.Text.IO
import Control.Monad (mapM_)
import qualified Data.ByteString.Lazy as BL
import System.Directory (getDirectoryContents)
import System.FilePath.Posix ((<.>), FilePath, takeBaseName)
import Data.Aeson

data Product = Product
  { image :: Text
  , description :: Text
  } deriving Show

instance ToJSON Product where
  toJSON (Product image description) =
    object ["image" .= image, "description" .= description]

encodeToJson :: FilePath -> IO ()
encodeToJson srcName = do
  let jsonName = takeBaseName srcName <.> "json"
  contents <- readFile srcName
  let contentLines = lines contents
  case contentLines of -- head is unsafe! try your code on an empty file
    (firstLine : restLines) -> BL.writeFile jsonName (encode Product
      { image = firstLine
      , description = unlines restLines
      })
    _ -> print ("error: invalid source file: " ++ srcName)

main = do
  names <- getDirectoryContents "."
  let srcNames = filter (isSuffixOf ".src") names
  mapM_ encodeToJson srcNames
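To make the behaviour concrete: for every *.src file in the current
directory, the first line becomes the "image" field and the remaining
lines become the "description" field of a <basename>.json file next to
it. A quick GHCi check of just the encoding part (assuming the
definitions above are loaded; Aeson does not guarantee key order, so
yours may differ):

ghci> :set -XOverloadedStrings
ghci> encode Product { image = "photo.png", description = "A nice product.\n" }
"{\"image\":\"photo.png\",\"description\":\"A nice product.\\n\"}"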
What bugs me is that this is a really simple thing to do in any other
language, but here I seem to be stuck between Prelude and Text, and as
a beginner, Prelude is king.
I tried ClassyPrelude, but I got stuck on not being able to get the
result of getDirectoryContents into the ClassyPrelude version of
mapM_. Sorry, but I gave up on that.
I think that given all this, the above code is as beautiful as Haskell
can pull this off at the moment.
If somebody can prove me wrong, I would love to still see it.
Greetings,
Bram
On Sun, Mar 1, 2015 at 1:42 AM, Travis Cardwell
<travis.cardwell at extellisys.com> wrote:
> On 2015年02月28日 20:23, Bram Neijt wrote:
>> It quickly became a collection of "import qualified" and "T.unpack,
>> T.pack" calls that made the whole thing look ugly [1].
> <SNIP>
>> [1] Source can be read here: https://gist.github.com/bneijt/9bdb4b1759790a8463c9
>
> File paths are of type `System.FilePath.Posix.FilePath`, a type alias for
> `String`. Note that this convention is also followed by the `ByteString`
> and `Text` libraries; they do not use the `ByteString` or `Text` types for
> file paths.
>
> In this code, file paths are packed only to have to unpack them again
> (twice!), which likely offsets any performance improvements of using the
> `Text` version of `isSuffixOf`.
>
> Here is a version using the same style but without packing file paths:
> https://gist.github.com/TravisCardwell/fd9981e4968e4af3751d
>
> I included a few other changes, which are commented in the code.
>
> By the way, I do not think that qualified imports are bad. I like them
> because they provide an explicit reference to the source module of a function.
>
> Cheers,
>
> Travis
> _______________________________________________
> Haskell-Cafe mailing list
> Haskell-Cafe at haskell.org
> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe