I have a new appreciation for Lenses in typed languages. When you first learn to parse untyped data like JSON into strongly or soundly typed data, Lenses seem wrong.
When you’re presented with 1,000+ pieces of dynamically shaped JSON data you can’t predict, typed Lenses suddenly become The Solution.
In Elm, lenses are an anti-pattern. The compiler offers guarantees whether you use the high-level or low-level functions in elm/json.
That encourages you to fight hard to get JSON shaped the way you want it in your Back-end For Front-end (BFF), so your front-end stays easier.
ReScript, however, gives you a lot more flexibility, mainly because its JavaScript integration story goes well beyond “talk through safe ports”: you literally make calls back and forth and use the data natively.
Still, even ReScript provides a lot of type-safe facilities like Js.Json.parseExn and basic pattern matching.
Both languages, for good reason, eschew lenses because their typing is so powerful.
Yet when JSON comes from users, not APIs you control, what recourse do you have? Lenses. Dynamic languages in particular have a wonderful array of options here.
JavaScript now has Optional Chaining natively, allowing you to safely dig amongst nulls in a pure way.
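For instance, here’s a quick sketch (the response shape here is made up) of digging into nested data without throwing when something is missing:

// any missing link in the chain short-circuits to undefined instead of throwing
const city = response?.user?.address?.city ?? "Unknown"
const firstTag = response?.posts?.[0]?.tags?.[0]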
Still, for more advanced composition of functions on top of that, Ramda and Lodash/fp reign supreme.
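A rough sketch of what that composition buys you with Lodash/fp (again, the data shape is invented):

const { flow, getOr, map } = require('lodash/fp')

// read a possibly-missing list, defaulting to [], then safely pluck a field from each item
const cityNames = flow(
  getOr([], 'results.locations'),
  map(getOr('Unknown', 'address.city'))
)

cityNames(response) // always an array, never a TypeError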
While Python lags behind in the safe-digging department, and the None-aware operators (PEP 505) are deferred, I feel like some of its lens libraries and their documentation are amazing.
I’m writing an SDK at work, and I debated writing one of the libraries that makes it up in JavaScript or Python instead of ReScript for this reason. Building up tons of types just to use them to inspect dynamic data seemed… dumb. Why not just use existing lens libraries?
I almost quit twice, but I’m glad I stuck with it. While ReScript does offer community-written Lens libraries, I wanted to do it by hand. You can learn a lot about a language’s ability to interact with dynamic data by creating your own Isomorphism.
I.e. text -> JSON -> type -> JSON -> text
Meaning, parsing some JSON from a text file over the network into strong types, making some modifications, and converting it back to JSON and then text to send back to a server.
Dynamic language libraries make this easy and fast.
However, the machinery around that inspection and modification is where errors can occur. While it was a lot more work, I’m glad I stuck with types: they ensured the edge cases where the shapes of data don’t quite match up (e.g. null and undefined being two different types) were handled.
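To make that null vs. undefined distinction concrete in JavaScript:

const data = JSON.parse('{"version": null}')
data.version            // null – the key exists but is intentionally empty
data.missing            // undefined – the key was never there at all
typeof data.version     // "object"
typeof data.missing     // "undefined"
data.version ?? "0.0.0" // "0.0.0" – ?? happens to treat both the same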
I’ve seen it argued that, at least for most use cases, Lens libraries are too much complexity, and it’s easier to just use simple gets/sets with Array.map and Array.reduce.
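Here’s a rough sketch of that simpler style (the items shape here is invented), which honestly covers a lot of ground:

// plain gets/sets: no lens library, just map over the items you want to change
const bumpPrices = items =>
  items.map(item => ({ ...item, price: (item.price ?? 0) + 1 }))

// or reduce when you're folding the shape down into something else
const totalPrice = items =>
  items.reduce((total, item) => total + (item.price ?? 0), 0)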
Lenses show their power when you compose them, so I get the resistance if you’re just doing simple parsing.
For example, here’s a reasonably safe Isomorphism for bumping your package.json version in JavaScript using Lodash/fp’s getOr and set:
const fs = require('fs/promises')
const { getOr, set } = require('lodash/fp')

const bump = () =>
  fs.readFile('package.json')
  .then( buffer => buffer.toString() )
  .then( JSON.parse )
  .then(
    json =>
      // default the version if it's missing, bump the patch, then set it back onto the object
      Promise.resolve(getOr("0.0.0", "version", json))
      .then( version => version.split('.') )
      .then( ([ major, minor, patch ]) => [major, minor, Number(patch) + 1] )
      .then( versions => set("version", versions.join('.'), json) )
  )
  .then( JSON.stringify )
  .then( text => fs.writeFile('package.json', text) )
Here’s an equivalent using focused:
const { lensProxy, over, iso } = require('focused') // named exports assumed from the focused package
const fs = require('fs/promises')

const _ = lensProxy()

// text <-> parsed JSON
const json = iso(JSON.parse, JSON.stringify)

// "1.2.3" <-> { major, minor, patch }
const versions = iso(
  string => {
    const [major, minor, patch] = string.split(".")
    return { major, minor, patch }
  },
  ({ major, minor, patch }) => [major, minor, patch].join(".")
)

const bump = () =>
  fs.readFile('package.json')
  .then( buffer => buffer.toString() )
  .then( text => over(_.$(json).version.$(versions).patch, patch => Number(patch) + 1, text) )
  .then( text => fs.writeFile('package.json', text) )
It saves you maybe one line of code. The value is more about the ability to compose those isos more easily. If you’re not composing? Just use the native code.
What I was interested in was each and every possible problem in that original Promise chain: I need to know which problem occurred so I can mark the data accordingly, and some of those problems I can fix ahead of time with compiler support. TypeScript’s variadic tuples can help here too, not just ReScript.
In conclusion, when I discovered Lenses, they provided a wonderful way to get pure code with dynamic data. As I moved to soundly typed languages, all the Lens libraries I saw seemed overcomplicated and dumb. Now I realize I was wrong: they have a cemented place when I cannot control the JSON.