A technique I stumbled on while implementing a bunch of crazy tricks in arc-on-mzscheme (just to get back to the principle of true axiomatic development) is to create fake Scheme-side objects that act as Arc functions. (You do have to change the 'disp and 'write functions, and probably handle errors yourself, so that the user doesn't know you're faking functions.) Something like this:
; a composer is just a tagged vector: #(composer fl fr)
(define (composer fl fr)
  (vector 'composer fl fr))

(define (composer? fn)
  (and (vector? fn) (eq? 'composer (vector-ref fn 0))))

(define (composer-fl fn) (vector-ref fn 1))
(define (composer-fr fn) (vector-ref fn 2))

; 'compose is now a reduction on '<base>compose
(xdef '<base>compose composer)

; this is where the speedup comes from!
(define (ar-funcall0 fn)
  (cond
    ((composer? fn)
     (let ((fl (composer-fl fn))
           (fr (composer-fr fn)))
       (ar-funcall1 fl (ar-funcall0 fr))))
    ...))
; etc. for ar-funcall1 to 4 and ar-apply.
> but this simple macro does not integrate with defm or pat-m (yet).
It doesn't have to: the 'p-m modifier will adapt to it. All it needs is an args . body somewhere at the end of your macro's arg list - which you do have.
All patterns would have to be of the same length, and ideally they should be LL(1) patterns, so they could dispatch on the first arg. Or should the pattern matching occur after the function has received all of its arguments?
What I would really like to see is currying with partial function application and dispatch on every function argument, so that (my-curried-fun (annotate 'foo val)) evaluates to a curried function for the type foo. Otherwise the types (or patterns) would have to be compared every time the function is applied, which may be inefficient, e.g. in
> i miss its awesometastic tables when using other languages.
Would you mind expounding on how "awesometastic" they are? I gather that they have a nice implementation of vectors hidden under the table data structure (so conceptually a vector is just a table, but the runtime optimizes vector-like usage into a true vector). Do they have more awesometastic properties?
i think it's the fact that those concerns don't occur in the language to begin with. from what i glanced at (i can't find the pdf anymore), the interpreter switches between different representations depending on how a particular table is used, and can switch representations dynamically for a given table (i'm not sure if the switching is bidirectional)
but the user never sees any of that. if you want an array or tuple you type
local blah = { 1, "two", function() return 3 end }
if you want a hash table you type
local blah = { bleh = "yep", bleagh = "whatever" }
-- both of the following are valid
print( blah["bleh"] )
print( blah.bleh )
basically, any sequency, key-valuey, or structured-ish thingie, i can just write out without worrying about anything. i'm sure it's doin' some sort o' magic under the hood, but to that end they could be sacrificing lawn gnomes for all i care
and keep in mind that Lua is likely the fastest interpreted language... or at least it was before the Javascript optimization race started. i think one of the Javascript engines has features inspired by the Lua VM
and then there's metatables: tables of properties that can be set to control the behavior of values in different situations. for example, one of the properties is __index, which is consulted when a table doesn't have a value for a given key. this enables straightforward memoization, inheritance, hidden data management, infinite data structures, etc
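here's a rough sketch of the memoization trick (memoize and squares are just names i made up for illustration, nothing from a real library):

local function memoize(f)
  -- return a table whose missing entries are computed by f on first access
  return setmetatable({}, {
    __index = function(cache, key)
      local value = f(key)       -- only runs when the key isn't cached yet
      rawset(cache, key, value)  -- store it, so later lookups are plain reads
      return value
    end
  })
end

local squares = memoize(function(n) return n * n end)
print(squares[12])  -- computed on the first lookup
print(squares[12])  -- read straight out of the table the second time

inheritance works the same way: point __index at a parent table and missing keys fall through to it.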
other metatable properties determine how the value behaves under addition, comparison, etc
you could, for example, implement complex numbers as if they were part of the language, simply by creating a seed table called i. when an expression such as 1 + 2 * i is evaluated, the __mul property would perform the complex multiplication, then return a new Complex table which would in turn handle the addition. at each step it would be a normal table, so you could do, for example:
print( (1 + 2 * i).polar_form )
keeping in mind that the table access is dynamic, so polar_form can be calculated each time it's accessed - no need for explicit getters/setters. also, this:
print( 8 * i^5 + 2 * i )
would work as it should because there's a __tostring property
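to make that concrete, here's a rough toy version of the complex-number idea (my own sketch, not a complete library - it only wires up __add, __mul, __index and __tostring, so the i^5 example above would also need a __pow):

local Complex = {}

local function lift(v)  -- promote plain numbers to complex tables
  if type(v) == "number" then return setmetatable({ re = v, im = 0 }, Complex) end
  return v
end

Complex.__add = function(a, b)
  a, b = lift(a), lift(b)
  return setmetatable({ re = a.re + b.re, im = a.im + b.im }, Complex)
end

Complex.__mul = function(a, b)
  a, b = lift(a), lift(b)
  return setmetatable({ re = a.re * b.re - a.im * b.im,
                        im = a.re * b.im + a.im * b.re }, Complex)
end

Complex.__index = function(z, key)  -- polar_form is recomputed on every access
  if key == "polar_form" then
    return { r = math.sqrt(z.re * z.re + z.im * z.im), theta = math.atan2(z.im, z.re) }
  end
end

Complex.__tostring = function(z)
  return string.format("%g + %gi", z.re, z.im)
end

local i = setmetatable({ re = 0, im = 1 }, Complex)
print(1 + 2 * i)                 -- printed via __tostring
print((1 + 2 * i).polar_form.r)  -- ~2.236, computed on the fly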
i've been thinking of making an APL-like DSL (would be nice to be able to implicitly map and whatnot) in this sort of manner. i haven't looked deep into Ruby but i believe it has things of this nature
the fact that tables are used everywhere can't be overstated. environments are tables, therefore you can do powerful things with environments: writing your own import/include function, reading and writing global state in a file, anaphora and implicits... pretty much a bunch of crazy shit
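for instance, here's a tiny Lua 5.1-style sketch of the environments-are-tables thing (setfenv; in 5.2+ you'd use _ENV instead) - the chunk's "globals" all land in a table i control:

local env = { print = print }  -- expose only what the chunk is allowed to see
local chunk = loadstring("x = 40 + 2  print(x)")
setfenv(chunk, env)  -- make env the chunk's global environment
chunk()              -- prints 42
print(env.x)         -- 42: the chunk's global state ended up in env, not in _G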
along with this come some very well-chosen features such as coroutines, closures, the implementation of most library functionality as modules (e.g. io.open, coroutine.wrap) to keep the syntax clean, etc
the tables by themselves go a long way, especially with their It Just Works® all-purpose syntax. but the way this flexibility is encouraged through the rest of the language makes it a wonderful unified package
Assuming you want to use showpos (which is arguably slower since you have to traverse the list each time):
(mac w/collect body
  `(accum collect ,@body))

(def observe (x y r)
  (w/collect:for i (- x r) (+ x r)
    (for j (- y r) (+ y r)
      (awhen (showpos i j)
        (collect it)))))
If you want efficiency and still want to use list structures, you might want to skip 'showpos.
Good call. I don't necessarily want to use showpos; its main advantage right now is that it nicely returns nil when I try to read an out-of-bounds position.
I guess without showpos I could grab the appropriate rows from world and discard the first X and last Y positions.
From here on I adapted the function into the one I really needed (do something if one of the 8 neighbours of posx, posy is X). Given that requirement it seemed easier to just gather rows row - 1 and row + 1, plus positions (row, col - 1) and (row, col + 1), and join them all into one list. This is what it ended up as:
(if (find X (flat:map [join (errsafe (world _))]
                      (list (- row 1) (+ row 1) (- col 1) (+ col 1))))
    ; do stuff
    )
The boring stuff, like building nice parameterized SQL queries and getting the data back from SQL, or launching a system process in parallel and keeping track of its status (and potentially aborting it if, e.g., it takes too long).
If we can do all of the boring stuff in a clean, concise way that makes everything easy, with the option of adding macros on top to boot, the boring stuff might well become fun, or at the very least painless.
Some time ago I started a GTK+ binding, now "paused". It's more boring than I initially thought. If you wish, look at it for a starting point (file gtk.arc in Anarki). I now think a binding to tcl/tk would be nicer and easier to use, though.
These would require a standard FFI system; otherwise we would end up writing Anarki-specific code. Such a fork would be a real Arc killer (in the bad sense of the term).
sacado built an FFI on Anarki... well, forks are generally bad, but with PG asleep until October or so... (maybe he's getting ready for Oktoberfest or something ^^)
That probably means we want some sort of "source" slot too, so that we can display shadowed values. Hmm. The association sublists, e.g. '(key1 . val1), can probably be shared directly, but lists also imply an ordered set, so we need to store that info too. Hmm.
> As for strings and symbols, I don't really know how they are actually implemented, but as far as I know, the idea is that 'foo and 'foo are the same memory location, while "foo" and "foo" are not necessarily the same object (thus allowing string mutation)
This is correct. And when you really look at it, mutating strings as if they were arrays of characters is hardly ever done in Arc; usually we just read them off as an array of characters and build a new string.
> In any case, characters just seem useless...
Another Lisp-like language, Skill, uses one-character symbols for characters (i.e. there is no separate character type). Also, many of its string-manipulation functions accept symbols as well (although they still all return strings).
Another point to consider is that if your a-list is very small (<= 5 elements), it could be faster than a hash table. The sharing behavior could be achieved with some sort of concatenated hash tables: a list of tables to consult in turn to find the desired element. That seems very slow, though. BTW, removing a-lists would be pointless: they're so simple to implement that a lot of developers (me included) would just re-invent them in their applications.
It's nice that Anarki has thread-local storage... I had stability issues with Anarki, so I've just been using arc2 with some patches lately... but I've definitely been missing thread-local storage...