[Haskell-cafe] Using lenses
Thu Oct 3 21:19:29 UTC 2013
> > Lenses for nested ... types ...
The most compelling uses I've seen for lenses go back to Benjamin Pierce's
(et al.) papers on "Updatable Views". I think that's where the 'theory'
started(?), although similar ideas had kicked around the relational
database world for some time.
So, a trivial example of that would be treating co-ordinates as
simultaneously polar and cartesian. This connects to Wadler's paper on
Views, which became Haskell's view patterns.
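To make that trivial example concrete, here is a minimal sketch (the names and the simple getter/setter encoding are my own, not the van Laarhoven encoding from the lens package): a point stored in cartesian form, with polar coordinates exposed as a "virtual field" through a lens.

```haskell
-- A point stored in cartesian form.
data Point = Point { px :: Double, py :: Double } deriving Show

-- A lens as a plain getter/setter pair (illustrative encoding only).
data Lens s a = Lens { view :: s -> a, set :: a -> s -> s }

-- Polar coordinates (radius, angle) as a view onto a cartesian Point.
polar :: Lens Point (Double, Double)
polar = Lens get put
  where
    get (Point x y)  = (sqrt (x*x + y*y), atan2 y x)
    put (r, theta) _ = Point (r * cos theta) (r * sin theta)

-- Apply a function to the focused part, then write it back.
over :: Lens s a -> (a -> a) -> s -> s
over l f s = set l (f (view l s)) s

-- Example: double the radius of a point without touching its angle.
scaled :: Point
scaled = over polar (\(r, t) -> (2 * r, t)) (Point 3 4)
```

The point is that the caller updates through whichever view is convenient, and the stored representation stays consistent.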
I guess, though, that in a short talk, Simon won't have time to dig into
all that history.
I see that many of the suggestions in response to Simon have talked about
deeply nested structures. (Including for JSON and XML so-called databases.)
Now certainly there are applications for nested data, where the topic
involves hierarchies. I can see that for ASTs and (E)DSLs, organic
molecules, exploring a game tree, ... nesting is the natural structure.
Given that you have a nested structure, lenses are a really neat approach.
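For instance, what makes lenses neat here is composition: two shallow lenses compose into a deep one, so updating a nested field avoids the usual record-unpacking boilerplate. A minimal sketch (the types and the simple getter/setter encoding are invented for illustration):

```haskell
-- Nested records, as in the "deeply nested structures" suggestions.
data Address = Address { city :: String } deriving Show
data Person  = Person  { name :: String, addr :: Address } deriving Show

-- A lens as a plain getter/setter pair (illustrative encoding only).
data Lens s a = Lens { view :: s -> a, set :: a -> s -> s }

addrL :: Lens Person Address
addrL = Lens addr (\a p -> p { addr = a })

cityL :: Lens Address String
cityL = Lens city (\c a -> a { city = c })

-- Composing lenses focuses through both layers at once.
(|>) :: Lens s a -> Lens a b -> Lens s b
outer |> inner =
  Lens (view inner . view outer)
       (\b s -> set outer (set inner b (view outer s)) s)

-- Deep update in one line, instead of nested record-update syntax.
moved :: Person
moved = set (addrL |> cityL) "Oxford" (Person "Simon" (Address "Cambridge"))
```

Without the composed lens, the same update would read `p { addr = (addr p) { city = "Oxford" } }`, and the nesting cost grows with each extra layer.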
I'm going to offer a contrary view: lenses are a solution to a problem
that never should have happened. A problem I'd characterise
as 'Unnecessary Complexity' [per Fred Brooks].
The data processing industry abandoned hierarchical databases in the '80s,
because the relational model is so superior. In ten years' time, I expect
that XML so-called databases will be regarded as a similar aberration.
One of the reasons is redundancy: XML so-called databases repeat field
content all over the place. And that's going to give update anomalies:
change some of those places, but fail to change all of them.
Now formally, hierarchical and relational data models can be made
isomorphic. So I'm not criticising the ability to capture data. I am
criticising the ease of access (for fetch and update). You end up with
your code for the 'business logic' getting muddled in with the code for
navigating the hierarchy, making both very brittle to maintain. Given that
nested data gives you an update headache in particular, lenses do help.
But a better solution is to do the data analysis up front, apply standard
normalisation techniques, and deal with 'flat' data models.
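To illustrate the flat alternative (the schema here is invented for illustration): rather than nesting employees inside departments, keep two flat record types linked by a key, so each fact is stored exactly once and an update touches exactly one place.

```haskell
import qualified Data.Map as Map

type DeptId = Int

-- Two flat 'tables', linked by DeptId rather than by nesting.
data Dept     = Dept     { deptName :: String } deriving Show
data Employee = Employee { empName :: String, empDept :: DeptId } deriving Show

-- Renaming a department is a single flat update; every employee
-- refers to it via the key, so there is no copy to forget to change
-- and hence no update anomaly.
rename :: DeptId -> String -> Map.Map DeptId Dept -> Map.Map DeptId Dept
rename k n = Map.adjust (\d -> d { deptName = n }) k
```

Compare that with the nested version, where the department name would be repeated under every employee and a rename would have to find and change every copy.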
And for flat data models, I don't see lenses having any advantages over
old-fashioned records. (But! We do need to solve the field-names problem.
It's great that Adam's GSoC work has incorporated lenses.)