But it's more important whether a language can do something nicely, which I don't necessarily claim Lisp can. After all, Turing machines can do all that, too. :)
would it collide if other people did something else with their braces?
You could hypothetically package the reader changes into their own modules so that you'd need to explicitly say which braces you're using, modules keep their changes local, etc. Anyone know of work done on this front?
Yes, it's like Python's operator overloading, but a bit better because of __index and __newindex.
So are __index and __newindex roughly like in the following (silly) Python code?
$ python
Python 2.5.4 (r254:67916, Jan 24 2010, 12:18:00)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> class foo:
...     def __init__(self, x):
...         self.x = x
...     def __getitem__(self, index):
...         return self.x + index
...     def __setitem__(self, index, value):
...         self.y = index + value
...
>>> bar = foo(5)
>>> bar[10]
15
>>> bar[20] = 5
>>> bar.y
25
If so, I imagine that overloading table operators in Lua becomes all the more powerful since tables are so ubiquitous. Thanks for the explanations.
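One nuance the __getitem__/__setitem__ analogy glosses over: Lua's __index and __newindex are consulted only when the key is missing from the table, whereas __getitem__/__setitem__ fire on every access. A slightly closer Python analogue for the read side, purely as a rough sketch, is a dict subclass with __missing__ (the write side has no exact dict hook, since __setitem__ runs on every assignment):

class defaulted(dict):
    # __missing__ is called only when the key isn't present,
    # roughly like __index being consulted only on a lookup miss
    def __missing__(self, key):
        return 'no such key: %r' % (key,)

d = defaulted()
d['x'] = 1
print(d['x'])   # 1 (key present, so __missing__ never runs)
print(d['y'])   # "no such key: 'y'" (falls through, like __index)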
You could hypothetically package the reader changes into their own modules so that you'd need to explicitly say which braces you're using, modules keep their changes local, etc. Anyone know of work done on this front?
Hmm, giving the job of custom readers to modules strikes me as making both modules and custom readers complicated and difficult to implement, and it puts unnecessary impositions on the programmer: I might well want to implement one part of my module using one syntax and another part using a different syntax.
On the other hand, I think being able to choose your custom reader by source-code file would be simple to use, simple to understand, and easy to implement; it would make it easy to customize your editor or IDE to understand your different syntaxes on a per-file basis, and it means modules could also be implemented easily, since they could do their work entirely after the source code has been read in.
I wonder if modules could be implemented with namespaces? You want to call my function foo which calls my function bar, but you don't necessarily want to have to name it "foo" in your own code and you want to be able to have your own "bar" without breaking my function foo. I'll have to think about that one.
I wonder if modules could be implemented with namespaces?
I guess I use the words "module" and "namespace" interchangeably. I'm not sure what else "module" ("package", "namespace", sometimes "library" (loosely), etc.) would mean.
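For what it's worth, Python's modules behave like this kind of namespace: the caller picks its own local name for the module, and the module's internal calls resolve in the module's own namespace. A rough sketch, with hypothetical file names mylib.py and main.py:

# mylib.py (hypothetical)
def bar():
    return 'library bar'

def foo():
    return bar()       # resolves to mylib's own bar

# main.py (hypothetical)
import mylib as m      # the caller picks its own local name

def bar():             # the caller's bar; it doesn't clobber mylib.bar
    return 'my bar'

print(m.foo())         # library bar
print(bar())           # my bar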
(use "braces-for-set-builder-notation.arc")
(load "a.arc")
(def set-I-really-want ()
  { (* v v) for v in (vals (table-I-really-want)) })
You wouldn't want a.arc's use of braces for tables to leak into b.arc, which uses braces for set-builder notation (and, going the other way, you don't want b.arc's braces to clobber the braces in the definition of table-I-really-want).
With a fancier module system, you might even have
b.arc
(qualify "braces-for-set-builder-notation.arc" s)
(load "a.arc") ; doesn't qualify its braces for tables
(def set-I-really-want ()
  s{ (* v v) for v in (vals (table-I-really-want)) })
(def another-table-I-want ()
  {foo 5 bar 10 baz 15})
Though there's no reason you couldn't do something like
c.arc
(use "braces-for-table.arc")
(def some-table ()
  {x 5 y 10})
(use "braces-for-set-builder-notation.arc")
; braces hereafter are for set-builder notation
(def some-set ()
  { x for x in (vals sig) })
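Tangentially, Python's __future__ imports already behave a bit like this per-file choice: they change how the file containing them is compiled without leaking into the files it imports. A rough Python 2 sketch, with hypothetical module names:

# mod_a.py (hypothetical): ordinary Python 2 division throughout this file
def halve(n):
    return n / 2           # 5 / 2 is 2 here

# mod_b.py (hypothetical): opts in to true division for this file only
from __future__ import division
import mod_a

print(mod_a.halve(5))      # 2 (mod_a's source is unaffected)
print(5 / 2)               # 2.5 (only mod_b's own source changed meaning)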
As Arc lacks a proper module system, you can already see the clobber-some effects of definitions being global/unqualified. APIs leak all over the place:
arc> (load "sscontract.arc") ; only really needs to provide sscontract
nil
arc> (priority #\!) ; but the user gets all of the implementation details!
1
Similarly, I don't want to load some library and have implementation-detail reader macros polluting the reader. (If the library's specifically for reader macros, then of course that's okay.)
Even some simple public/private control would be nice. I've played around with this in Arc before. E.g.,
(let localize nil
  ; rewrite (def name ...) as (assign name (fn ...)) when name isn't exported
  (assign localize
          (fn (expr fs)
            (when (caris expr 'def)
              (if (~mem (cadr expr) fs)
                  `(assign ,(cadr expr) (fn ,@(cddr expr)))
                  expr))))
  ; collect the non-exported def names, bind them locally, and localize the body
  (mac provide (fs . body)
    (let private (trues [and (caris _ 'def)
                             (~mem (cadr _) fs)
                             (cadr _)]
                        body)
      `(let ,private nil
         ,@(map [localize _ fs] body)))))
arc> (macex1 '(provide (f g)
                (def f (x) (h (+ x 1)))
                (def g (x) (+ (h x) 1))
                (def h (x) (* x x))))
(let (h) nil
  (def f (x) (h (+ x 1)))
  (def g (x) (+ (h x) 1))
  (assign h (fn (x) (* x x))))
arc> (def foo () (prn "outer foo") nil)
#<procedure: foo>
arc> (provide (bar)
       (def foo () (prn "inner foo") nil)
       (def bar () (prn "bar") (foo)))
#<procedure: bar> ; note: no redef warning, since def changed to assign
arc> (bar)
bar
inner foo
nil
arc> (foo)
outer foo
nil
But this particular approach doesn't interact with macros nicely, doesn't let you qualify namespaces, isn't general, etc. So it's really suboptimal.
I find this line really interesting. If I'm reading it right, you can alter the metatable (operator overloading) on the fly? So you could pass in a table as a parameter, set a metatable, play with some operators, then set a different metatable to overload operators differently?
In Python, the operators are kind of tied to specific classes. I guess you can go about casting and converting and inheriting and monkeypatching and such, but I don't think it's as straightforward as just setting a new metatable. Admittedly, I can't think of when I've ever needed to alter operator-overloading dynamically like that, so I've never really tried. Do you know if it's useful in Lua code? Or am I completely off-base here?
Hmm, the only times I've used it are where I'd be casting objects in other languages... I've used it to 'cast' between 'tables' doing similar things, like `my1.rect` to `my2.rect`, if they both use .x, .y, .width, and .height to store their coordinates. (Not sure that's a good practice, because they can change.) Yes, you can alter metatables any time, but I don't do it that often.
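For comparison, the closest rough Python analogue of swapping a metatable at runtime that I know of is reassigning an instance's __class__ (special methods are looked up on the type, so the operators change along with it). This is only an illustrative sketch with made-up class names, not equivalent to Lua's setmetatable:

class meters(object):
    def __init__(self, n):
        self.n = n
    def __add__(self, other):
        return meters(self.n + other.n)

class loud_meters(meters):
    # same data layout, different operator behaviour
    def __add__(self, other):
        print('adding!')
        return meters(self.n + other.n)

a, b = meters(2), meters(3)
print((a + b).n)           # 5, quietly

a.__class__ = loud_meters  # roughly like swapping in a new metatable
print((a + b).n)           # prints 'adding!' first, then 5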