Arc Forum
The power of function combinators (awwx.ws)
14 points by aw 5504 days ago | 6 comments


2 points by rntz 5503 days ago | link

Have you used Parsec? It's Haskell's parser-combinator library, and I noticed quite a few similarities. Haskell's typeclasses also turn out to provide a lot of general abstractions that can be applied to parser-combinators as well; for example, your 'on-result is approximately equivalent to Haskell's 'fmap, which is the function for genericised mapping. (And of course there's the fact that the parsec parser type is a monad, but I won't get into that...)
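
To make the 'on-result / 'fmap correspondence concrete, here is a minimal parser-combinator sketch in Python (hypothetical names, not the article's Arc code or Parsec's API), assuming a parser is a function from a list of characters to `(result, remaining)` on success or `None` on failure:

```python
def char(c):
    """Parser that matches a single specific character."""
    def parse(inp):
        if inp and inp[0] == c:
            return (c, inp[1:])
        return None
    return parse

def on_result(f, parser):
    """Apply f to a parser's result -- the analogue of Haskell's fmap."""
    def parse(inp):
        r = parser(inp)
        if r is None:
            return None
        value, rest = r
        return (f(value), rest)
    return parse

# Turn the matched character '7' into the integer 7:
digit_value = on_result(int, char('7'))
print(digit_value(list("7abc")))   # (7, ['a', 'b', 'c'])
print(digit_value(list("x")))      # None
```

The structural point is that `on_result`/`fmap` transforms only the result and leaves the remaining input untouched, which is exactly the Functor law in disguise.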

On a side note, Parsec is a little different in nature, because it distinguishes between a failure without consuming input (soft) and failure after consuming input (hard); soft failures fall back to the closest enclosing alternation, while hard failures fall back to the nearest enclosing "try" block, and then become soft failures. This means that, if you're careful, you can write with Parsec without having to worry about the exponential blowup typical of the parser-combinator approach. I'd be interested to see whether your JSON parser has any such pathological cases.
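
The soft/hard distinction can be sketched in a few lines of Python (an illustration of the idea only, not Parsec's actual implementation), assuming a parser returns a triple `(consumed, result_or_None, rest)`:

```python
def string(s):
    """Match the literal string s against a list of characters."""
    def parse(inp):
        rest = inp
        for ch in s:
            if rest and rest[0] == ch:
                rest = rest[1:]
            else:
                # Failing after consuming input is a *hard* failure.
                return (rest is not inp, None, inp)
        return (len(s) > 0, s, rest)
    return parse

def alt(p, q):
    """Alternation: only falls through to q on a *soft* failure of p."""
    def parse(inp):
        consumed, result, rest = p(inp)
        if result is None and not consumed:
            return q(inp)                   # soft failure: try the alternative
        return (consumed, result, rest)     # success, or hard failure
    return parse

def try_(p):
    """Demote a hard failure to a soft one, like Parsec's try."""
    def parse(inp):
        consumed, result, rest = p(inp)
        if result is None:
            return (False, None, inp)
        return (consumed, result, rest)
    return parse

# "lambda" and "lamp" share the prefix "lam", so plain alternation
# fails hard after consuming "lam"...
p = alt(string("lambda"), string("lamp"))
print(p(list("lamp")))    # hard failure: (True, None, ['l', 'a', 'm', 'p'])
# ...while wrapping the first branch in try_ lets the second one run:
q = alt(try_(string("lambda")), string("lamp"))
print(q(list("lamp")))    # (True, 'lamp', [])
```

This is why careless `try_` placement reintroduces the exponential blowup: it silently restores full backtracking at that point.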

-----

2 points by aw 5503 days ago | link

I have used Parsec. One of the most dramatic differences is that Parsec supports backtracking, and so it's a more powerful parser. Since parsing JSON doesn't need backtracking, I got to avoid both the complexity of implementing backtracking and the need to carefully avoid exponential backtracking blowups :D
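
The reason JSON gets away without backtracking is that every value is identified by its first character, so a parser can commit to a branch immediately. A toy Python sketch of that dispatch (my own illustration, not the article's Arc code, and covering only a fragment of JSON):

```python
def parse_value(s, i=0):
    """Parse one JSON value starting at index i; return (value, next_index).
    Handles just enough of JSON to show the single-character dispatch."""
    c = s[i]
    if c == '"':                            # string (no escapes, for brevity)
        j = s.index('"', i + 1)
        return (s[i + 1:j], j + 1)
    if c == '[':                            # array
        items, i = [], i + 1
        while s[i] != ']':
            v, i = parse_value(s, i)
            items.append(v)
            if s[i] == ',':
                i += 1
        return (items, i + 1)
    if c == 't' and s[i:i + 4] == 'true':   # literal
        return (True, i + 4)
    if c.isdigit() or c == '-':             # number (integers only)
        j = i + 1
        while j < len(s) and s[j].isdigit():
            j += 1
        return (int(s[i:j]), j)
    raise ValueError("unexpected character %r at %d" % (c, i))

print(parse_value('[1,"two",true]')[0])    # [1, 'two', True]
```

No branch ever needs to be retried: the first character alone decides which rule applies, which is exactly what makes the grammar LL(1)-friendly.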

-----

4 points by aw 5504 days ago | link

This is an updated and expanded version of my earlier "A parser-combinator approach to parsing JSON" writeup.

-----

3 points by conanite 5504 days ago | link

This is an awesome piece of work - not just the code, but the whole tutorial/guide.

Previously (iirc) you simply used the input stream to keep track of the parse position - now you manage the input as a list of characters. I presume this is to allow backtracking from failed matches, or is there another reason?

My gut reaction to the idea of converting the input to a list was "OMG scaling issues!!!" but I must admit I've never had to deal with a really big json object.

-----

4 points by aw 5504 days ago | link

Thanks!

Actually, all the various versions of my JSON parser have converted the input to a list of characters. I wouldn't be surprised if it did in fact turn out to be slow, but I see it as an optimization issue: until I have (or someone else has) a program that's actually slow because of it, I don't know whether it's a useful optimization to work on. For example, if JSON parsing were a thousand times faster without lists, but JSON parsing was only 0.1% of the program's total execution time, then I might not care.

-----

2 points by coconutrandom 5501 days ago | link

Thank you so much!! I realize this isn't new, but I was just now wondering how to parse JSON with Arc when this post popped up. Plus a CS lesson included!

That's easily worth my $20. Rock on!

-----