[xmonad] Problem CPP'ifying XSelection.hs

Spencer Janssen spencerjanssen at gmail.com
Sun Sep 21 02:29:14 EDT 2008


On Sat, Sep 20, 2008 at 11:33:43AM -0400, Gwern Branwen wrote:
> So, per a suggestion from sjanssen, I was refactoring my XSelection.hs to optionally use the 'decode' from utf8-string. (When I originally wrote it, utf8-string wasn't even available as an optional dependency, so I had copied in the decode definition.)
> 
> I finished editing, and everything looked dandy, but every time I compile it, or load it into GHCi (with :set -DUTF8), it fails to compile! And it fails in a way that really perplexes me:
> 
>     Could not find module `Codec.Binary':
>       Use -v to see a list of the files searched for.
> 
> Line 34 reads:
> 
>     import Codec.Binary.UTF8.String (decode)
> Note that the two lines disagree on which module is being imported: the error mentions `Codec.Binary', but the import names Codec.Binary.UTF8.String...
> 
> I tried reinstalling utf8-string and X11-xft, thinking perhaps that was the problem, but that is not it. I can load utf8-string modules fine in GHCi, and I can swap that import line out for 'import Codec.Binary.Anythingelse'; but the moment I use .UTF8.*, it fails. I've looked over my changes several times, and they look to me to be the same as the CPP usage in Font.hsc, for example; and if -DUTF8 isn't set, everything works fine.
> 
> I am a little stumped. I can't darcs send it because I don't know if the code is broken or not - it could just be my system. Find the patch attached.
> 
> --
> gwern
> Bluebird 5707 Kosovo Zemin XM Guppy Internet NVD ABC SGI
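
For reference, the conditional import being discussed presumably looks
something like the sketch below (the actual patch is attached to Gwern's
message, and the fallback decoder here is only a placeholder, not the
real copied-in definition):

    {-# LANGUAGE CPP #-}
    #ifdef UTF8
    import Codec.Binary.UTF8.String (decode)
    #else
    import Data.Char (chr)
    import Data.Word (Word8)

    -- placeholder standing in for the copied-in UTF-8 decoder;
    -- this naive version just treats each byte as a code point
    decode :: [Word8] -> String
    decode = map (chr . fromIntegral)
    #endif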


I believe the "UTF8" token inside the module name is being replaced by "1"
(an unadorned -DUTF8 defines UTF8 as 1), which breaks the import statement.
The same pattern seems to work in Font.hsc; I suspect that is because hsc2hs
uses a different C pre-processor.  The solution is to change our defined name
from "UTF8" to something that cannot appear in a module name ("USE_UTF8", for
example).  Be sure to update all other files that use the "UTF8" macro.
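
Concretely, with UTF8 defined, cpp substitutes inside the module name
("." is not an identifier character in C, so the UTF8 between the dots
is a token of its own), and the import presumably comes out as:

    import Codec.Binary.1.String (decode)

which is no longer a valid module name; that would explain why the error
mentions only `Codec.Binary'.  A minimal sketch of the renamed guard,
assuming the flag is now passed as -DUSE_UTF8 wherever -DUTF8 was passed
before:

    #ifdef USE_UTF8
    import Codec.Binary.UTF8.String (decode)
    #else
    -- the local copy of decode stays in this branch, unchanged
    #endif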


Cheers,
Spencer Janssen

