[Haskell-cafe] Cons of -XUndecidableInstances
Scott Lawrence
bytbox at gmail.com
Mon Jun 6 08:21:25 CEST 2011
Oops. I can just abandon the Entropy typeclass and put the function
directly into Model, eh? Yeah, I think I'll do that...
Supposing I didn't want to - any alternatives? Other instances of
Entropy I might consider:
instance (Eq a) => Entropy [a]
instance (Eq a) => Entropy (Tree a)
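For the list case I'm picturing something like the empirical entropy of
the element frequencies - rough, and the Eq-only counting is obviously
quadratic, but it gives the idea:

import Data.List (nub)

-- empirical entropy (in bits) of the element frequencies
instance Eq a => Entropy [a] where
  entropy xs = negate (sum [ p * logBase 2 p | p <- ps, p > 0 ])
    where
      n  = fromIntegral (length xs)
      ps = [ fromIntegral (length (filter (== x) xs)) / n | x <- nub xs ]
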
On Mon, Jun 6, 2011 at 02:13, Scott Lawrence <bytbox at gmail.com> wrote:
> On Mon, Jun 6, 2011 at 01:52, Yitzchak Gale <gale at sefer.org> wrote:
>> Scott Lawrence wrote:
>> You almost never want to use UndecidableInstances
>> when writing practical programs in Haskell.
>
> Ah. That's what I wanted to know :P
>
> (Although it does seem to me - from looking around the docs and the
> source - that GHC's rules for allowing certain combinations might be a
> bit too conservative - but then, I have next to no idea what I'm doing,
> so hey.)
>
>> When GHC tells you that you need them, it almost
>> always means that your types are poorly designed,
>> usually due to influence from previous experience
>> with OOP.
>
> * hides behind book
>
>>
>> Your best bet is to step back and think again about
>> the problem you are trying to solve. What is the
>> best way to formulate the problem functionally?
>> That will lead you in the right direction. Please
>> feel free to share more details about what you are
>> trying to do. We would be happy to help you work out
>> some good directions.
>
> I'm modelling text in a Markov-model-like way. I have an actual Markov
> model (albeit one in which X_n depends on a fixed range X_(n-1) ..
> X_(n-k)). I'm vaguely anticipating the presence of other models:
>
> class Model m a | m -> a where
>   lexemes :: m -> Set a
>   genFunc :: m -> [a] -> ProbDist a
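>
> Concretely, the Markov type looks something like this (the field names
> here are approximate, and uniformDist stands in for whatever builds the
> uniform distribution over the alphabet):
>
> import Data.Map (Map)
> import qualified Data.Map as Map
> import Data.Set (Set)
>
> data Markov a = Markov
>   { order    :: Int
>   , alphabet :: Set a
>   , table    :: Map [a] (ProbDist a)
>   }
>
> -- (this instance wants FlexibleInstances on top of the extensions the
> -- class already needs)
> instance Ord a => Model (Markov a) a where
>   lexemes = alphabet
>   genFunc m ctx
>     | length ctx == order m = Map.findWithDefault fallback ctx (table m)
>     | otherwise             = fallback
>     where fallback = uniformDist (alphabet m)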
>
> Having got that working, I'm trying to estimate the information entropy of a model:
>
> entropy :: (Model m) => m -> Double
>
> (This is a slight simplification, since entropy needs a second
> argument "precision" to know when to terminate.)
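>
> (The building block is just the Shannon entropy of a single ProbDist -
> something like the following, where probs is standing in for whatever
> accessor ProbDist actually provides:
>
> distEntropy :: ProbDist a -> Double
> distEntropy d = negate (sum [ p * logBase 2 p | p <- probs d, p > 0 ])
>
> entropy then combines these per-context values, roughly iterating until
> successive estimates agree to within "precision".)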
>
> Which is all well and fine - this function is pretty trivial to
> implement, on the assumption that Markov (the instance of Model
> described above) implements genFunc properly. But it happens not to -
> the list argument to genFunc must be exactly the right length,
> otherwise a uniform probability distribution is used. So my
> OOP-infected mind wants to specialize 'entropy' for Markov:
>
> class Entropy d where
>   entropy :: d -> Double -- again, simplified
>
> Note that it's not (Entropy d a), because the type of the lexeme
> doesn't matter. Now, the problem code:
>
> instance (Model m a) => Entropy m where
>   entropy = undefined
>
>
> As you might have picked up, I suspect the part where I want to
> specialize entropy for Markov is where I mess up - but I'm not sure
> what to do. (To be clear, I expect to want to specialize entropy for
> other models too - the general function I have in mind would be
> horribly slow for many reasonable models.)
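>
> (The obvious dodge is to skip the blanket instance and write one
> instance per model, e.g.
>
> instance Ord a => Entropy (Markov a) where
>   entropy = markovEntropy -- whatever the specialized computation is
>
> which GHC is perfectly happy with - but then every model needs its own
> boilerplate instance, and I lose the generic fallback entirely.)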
>
> Thanks.
>
>>
>> Regards,
>> Yitz
>>
>
>
>
> --
> Scott Lawrence
>
--
Scott Lawrence