I'm only halfway through the presentation, but it's really interesting.
Some thoughts: the grouping in the let form seems interesting, but I worry that it'll just be more confusing: wrapping all the args of e.g. let (c.f. Arc's with) in a single set of parens is a recipe for disaster, and I'm wondering how values that are themselves function calls work. Does this work:
(let top (+ (- b)
            (- (expt b 2)
               (* 4 a c)))
     bottom (* 2 a)
  (/ top bottom))
Possibly, you could make the first body argument to let be the first odd-positioned item in let's arguments that has parens, rather than simply the first item in the arguments that has parens, but that prevents this from being working code:
(let a 7
  a)
At least, it'll do something unexpected. Either way, I don't see these grouping parens as too complicated to understand. I do see how it overloads the meaning of parentheses, but 1. I don't see that as a problem, and 2. you could simply say that curly braces are exclusively for grouping. Either way is more readable.
The grouping symbols are much harder to parse than parentheses. Parens are bigger than characters, and our brains can easily pick them out. A symbol can't be so easily found, especially when it is mixed in with other symbols.
I like the idea of making awesome developer tools, including highlighting, but not everyone uses the same tools. How long will it be before there's an Emacs mode for the language? A vim plugin? How many people won't use one of those, or the text editor distributed with the language? It'll be hard to post code online to arbitrary sites, as it won't be syntax highlighted. This will be especially hard if you're trying to show people how nice your language is: "here's some code, isn't it cool?" is an easier sell than "go here, then look at the code, then see that it's cool".
Implicit parens? That's insane! This is a parse error!?:
{foo a b c}
That seems far more readable than the "proper"
{foo a b c
Anyway, I'm interested in what happens here. I'll have to watch the rest tomorrow.
"The grouping symbols are much harder to parse than parentheses. Parens are bigger than characters, and our brains can easily pick them out. A symbol can't be so easily found, especially when it is mixed in with other symbols."
When I use grouping symbols, I choose symbols like "-" and ":" which skirt that issue. I've considered setting a language-wide policy to use these symbols exclusively for grouping, but I haven't yet designed a language where that policy would fit in. (It would fit into Arc or Scheme, for instance, but I didn't design those. :-p )
---
"you could only make the first body argument to let the first odd item in the arguments to let that has parens, and not simply the first item in the arguments that has parens, but that prevents this from being working code"
The technique I use in Lathe's 'xloop is to greedily consume variable-and-anything pairs from the body. This terminates either when there's a non-variable at an odd location, or when there aren't enough elements left to make a pair, whichever comes first.
On a more extreme note, I'd kinda like to see most binding happen in 'do. If a variable-and-anything pair appears in a (do ...) form, that binding should hold for the remainder of the form. I implement this in Lathe as 'lets.
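To make the greedy strategy concrete, here's a rough Arc-flavored sketch (not Lathe's actual code; the names here are made up). It splits something like (a 1 b 2 (prn a b)) into the bindings ((a 1) (b 2)) and the remaining body:

(def split-bindings (body)
  (if (and (cdr body) (isa (car body) 'sym))   ; a variable with something after it
      (let (binds rest) (split-bindings (cddr body))
        (list (cons (list (car body) (cadr body)) binds)
              rest))
      (list nil body)))                         ; non-variable at an odd spot, or too few elements left

So (split-bindings '(a 1 b 2 (prn a b))) gives (((a 1) (b 2)) ((prn a b))).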
---
"Implicit parens? That's insane!"
I don't get it either. :) But if the idea is to have a "there are only a few ways to write it"[1] approach to syntax, that would explain both the implicit paren enforcement and the mandatory four-space indent.
By the way, I think you might need to write your 'let example like this:
let top (+ (- b
- (expt b 2
* 4 a c)
,bottom (* 2 a
/ top bottom
"Possibly: you could only make the first body argument to let the first odd item in the arguments to let that has parens, and not simply the first item in the arguments that has parens, but that prevents this from being working code:"
There's actually a very sane way of handling this... You just say that "let" has only a single body expression, so as soon as you find a mismatched pair, that must be the body. That's what my language does with "$lets":
$lets: a 1
b 2
c 3
foo bar
The above is equivalent to this Arc code:
(withs (a 1
b 2
c 3)
(foo bar))
If you want to have multiple expressions in the body, you just use "$do":
$lets: a 1
b 2
c 3
$do: foo bar
qux corge
This is like how Arc requires "do" if you want to evaluate multiple expressions in an "if".
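For reference, the Arc idiom looks like this (the variable names are made up, just to show the shape):

(if (> score 100)
    (do (prn "new high score!")     ; 'do sequences multiple expressions in one branch
        (= high-score* score))
    (prn "try again"))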
---
In any case, I have to agree with you. The syntax rules seem somewhat convoluted and confusing. Naturally I'm biased, but I prefer the way my language handles the situation: it has very few parens, but it achieves that in a much more flexible and, I believe, simpler way. I've put quite a bit of work into making the syntax work well, and will continue to do so.
By the way... certain user conventions in my language mostly obviate the need for the syntax highlighting scheme that he came up with. In my language:
1. Vaus are prefixed with $ (like in Kernel)
2. Predicates end with ? (like in Kernel)
3. Non-referentially-transparent things end with ! (similar to Kernel)
4. Local variables start with a capital letter (like in Shen)
These rules combined mean that it's trivial to write a syntax highlighter for my language, and that the information is available even if you don't have syntax highlighting. To demonstrate, here's a 1-to-1 translation of the "factorial" function (shown in the video) into my language:
$def factorial; N ->
$loop: Cnt N
Acc 1
$if: is? Cnt 0
Acc
$recur; --Cnt; Acc * Cnt
Naturally I wouldn't define it that way in my language, but... you get the idea.
Certainly there are some nice ideas in there, like giving special colors to external variables and so forth... but I think languages should look nice and be usable even without any syntax highlighting at all, with syntax highlighting as just an extra convenience, nothing more.
---
By the way... here's the idiomatic way to write "fact" in my language:
$def fact
0 -> 1
X -> X * (fact X - 1)
Inefficient (because it isn't tail recursive), but even so, it's simple. If you want a tail-recursive version...
$def fact-acc
0 Acc -> Acc
X Acc -> fact-acc X - 1; X * Acc
$def fact: X -> fact-acc X 1
Or maybe you want to be like Haskell[1] and define it as...
The colors on this machine aren't quite right, but it shows what I care about:
a) Comments since they're never evaluated
b) Literals since they eval to themselves
c) Parens and ssyntax -- mostly as delimiters, but with backquotes distinguished
Everything else is unhighlighted. If the language does its job I really shouldn't be thinking about whether something's a macro. And local variables ought to be the default, so why add a little salience to Every Single One?
---
Wart comes with the vim settings for this highlighting: http://github.com/akkartik/wart/blob/2e01126102/vimrc.vim. It's very smart about ssyntax. The colors really indicate precedence. Notice in the second statement how some colons are colored like ssyntax, but not others. Or how the exclamation in mac! at the bottom isn't colored like ssyntax.
But after all that I don't want to make too many assumptions about how a new reader will view one's code. It needs to be visually balanced even without highlighting. Your typography rules remind me a little of early wart. See the if macro at the end of http://www.arclanguage.org/item?id=15137 -- and your comment on http://www.arclanguage.org/item?id=15140 :)
"If the language does its job I really shouldn't be thinking about whether something's a macro."
This is the same argument we had before... it's just not true. Macros/vaus behave fundamentally differently from functions; they are not the same thing. By making them stand out, it gives your eyes something to grab onto.
Humans are wonderfully good at noticing patterns, but only if there's enough information there to pattern match on. If you don't provide this information in the syntax, it adds additional mental overhead.
You now have to memorize whether something is a vau or not (for common things like $let this isn't a problem, but for things less commonly used it can be a pain). The same goes for locals: if it isn't apparent in the syntax whether a variable is local or not, you have to mentally scan up the scope chain every single time you glance at code.
One of the problems with Lisp is that due to its lack of syntax, there are very few patterns that your mind can pick up on, so you have to do a full-blown mental parse of the source code just to determine whether something is a local or a vau or whatever.
It might not seem like much, but all the tiny extra mental overheads do add up. I believe that once you get used to my syntax, it's easier to read source code, because just by glancing at it your mind can notice all the little patterns.
Of course, there might be better criteria than fn/vau/global/local/predicate/mutation... if so, I'd be interested in hearing them[1]. But I do think, whatever criteria you choose, it's important for them to be visually apparent so our poor human brains don't have to work so hard to parse our code. We are visual creatures, let's give our minds some visual feedback to chew on.
---
"And local variables ought to be the default, so why add a little salience to Every Single One?"
You forget that most functions/vaus are global. In fact, in my language, roughly 1/2 of the variables are globals, with the other 1/2 being locals. Out of those globals, roughly 1/3 are vaus, with the other 2/3 being fns.
You're right, there is a fine line between adding syntax to make the source code more readable, and adding syntax so it ends up looking like Perl. I've tried to add in syntax only when I feel there's a significant benefit from doing so.
I'll note that my language doesn't have the particular problem mentioned in that post, because my language doesn't have quasiquote/unquote/unquote-splicing, so that example would just be `@Body`.
---
By the way, I used to be really off-put by how Kernel uses `$?!` in symbols to give them special meaning, and I also really disliked how Shen has local variables start with a capital letter... but after trying it out for a while, I got used to it and found that it actually wasn't that bad after all. Now I think it's an overall net win.
---
For comparison, here's how my syntax highlighting for my language currently looks:
* [1]: In particular, I just realized that it might be better to use $ for constructs that introduce additional binding names. This might be more useful than a general vau/fn distinction.
Then again, after looking through the source code, there were only a handful of vaus that didn't introduce new bindings: and, catch, hook, if, or, and quote
So, given how most vaus apparently exist for name binding, I think it's best to just use the general vau/fn distinction.
"Macros/vaus behave fundamentally different from functions, they are not the same thing. ..it adds additional mental overhead."
Functions, macros, they're just ways to get certain behavior in the most readable way possible. Perhaps they add mental overhead in Kernel because it's concerned about hygiene and such abstract matters.
Wanting to track your macros is OCD like wanting to avoid namespace pollution is OCD. Just relax, use what you need, remove what you don't need, and the function/macro distinction will fade into the background.
"Functions, macros, they're just ways to get certain behavior in the most readable way possible."
Sure. And their behavior is different: macros/vaus don't evaluate their arguments, functions do. That's because they're used for different purposes, so distinguishing between them is important and/or useful.
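A tiny Arc illustration of that difference (toy definitions, obviously):

(def fn-example (x) (list 'got x))       ; argument is evaluated before the call
(mac mac-example (x) `(list 'got ',x))   ; argument arrives as unevaluated code

(fn-example (+ 1 2))    ; => (got 3)
(mac-example (+ 1 2))   ; => (got (+ 1 2))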
---
"Perhaps they add mental overhead in kernel because it's concerned about hygiene and such abstract matters."
I don't see what hygiene has to do with it... we're discussing making it easy to tell at a glance whether a particular variable is a function or a vau, that's all. That's true regardless of whether the vau is hygienic or not.
I'll also note that I have not actually programmed in Kernel, so all my talk about "mental overhead" is actually referring to Arc, which is a distinctly unhygienic language.
In any case, my gut says that making a distinction between vaus and functions is important, so that's what I'm doing.
---
"Wanting to track your macros is OCD like wanting to avoid namespace pollution is OCD. Just relax, use what you need, remove what you don't need, and the function/macro distinction will fade into the background."
I do indeed worry about namespaces, which is why my language is going to have fantastic namespace support, most likely built on top of first-class environments.
Not only does this allow people to write solid libraries that don't need to worry about collisions, but it also has the major benefit that you know exactly what a variable refers to, because each module can be studied in isolation. You can't do that when everything is in one namespace.
So this has the same benefits that lexical scope and referential transparency give you: you can study different subparts of the system in isolation without worrying about what another part is doing.
Incidentally, that's why dynamic scope is so bad: it's not enough to understand what a single function is doing, you also need to understand what the rest of the program is doing, because some other random part of the program might change the dynamic variable.
That's why "lexical by default, marking certain variables as dynamic" is superior to "dynamic by default": it increases locality because you don't need to jump around everywhere trying to figure out what everything does, you can just focus on one part of the system at a time.
That's the whole point of functional programming, and my language is intentionally designed as a functional language. In fact, I plan for all the built-in data types to be immutable as well, for the exact same reasons. This should also help immensely with concurrency, similar to Clojure.
"I'll also note that I have not actually programmed in Kernel, so all my talk about "mental overhead" is actually referring to Arc, which is a distinctly unhygienic language."
That is really interesting, that our respective experiences are so different.
I'm with you on "lexical by default" -- I'm not totally crazy :) But the simplest possible mechanism that provides the similar advantages of namespaces is to just warn when a variable conflict is detected, when a global is defined for a second time.
I'm trying hard to introspect here, and I think the difference between lexical scope and namespaces for me is that when I'm programming by myself I don't need a second namespace, but I do still find dynamic scope to be error-prone. My entire belief system stems from that, that one should program as if one was working alone. Everything that helps that is good, anything that isn't needed is chaff.
I like the highlighting; the color makes the typography less jarring. But you're right, it's one of those things one should familiarize oneself with before judging.
Oh it's very simple. The operators + - * / < > <= >= are the only infix operators (for now). They have the usual precedence rules that other languages use.
How they work is, they take one expression on the left and one expression on the right, and then wrap them in a list, so that `X + Y` becomes `(add X Y)`, and then "add" is the actual add function.
So they're just syntax sugar for common infix operations, that's all. That's why the last example passed "mul" to "sum" rather than "*".
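Here's a rough Arc sketch of that kind of desugaring, just to pin the idea down (definitely not that language's implementation; only "add" and "mul" are confirmed names, the rest of the operator table is guessed):

(= prec* (obj <  1  >  1  <= 1  >= 1
              +  2  -  2
              *  3  /  3))
(= name* (obj +  'add  -  'sub  *  'mul  /  'div
              <  'lt   >  'gt   <= 'lte  >= 'gte))

(def parse-infix (toks (o min-prec 1))
  ; returns (list parsed-expression remaining-tokens)
  (withs (left (car toks)
          rest (cdr toks))
    (while (and rest
                (prec* (car rest))
                (>= (prec* (car rest)) min-prec))
      (let op (car rest)
        (let (right rest2) (parse-infix (cdr rest) (+ 1 (prec* op)))
          (= left (list (name* op) left right))
          (= rest rest2))))
    (list left rest)))

So (car (parse-infix '(x + y * z))) gives (add x (mul y z)), with * binding tighter than +.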
I got the sense that the unbalanced parens are just a pedagogical device (though I'm not sure they helped me understand it any better). I didn't actually see any code samples with unbalanced parens. The final factorial example still had balanced parens which would be parse errors according to his earlier slide.
Above all, I'm left with a frustration that there isn't code to play with so I can clarify my confusions on my own and not get held up by ambiguities in some presentation. This is a common complaint of mine. Why do people do this? Why are they so concerned with getting it right before they're willing to let the world see it? If you throw it out earlier, who knows, somebody might come and contribute earlier. I'm more motivated to contribute when something is half-baked. Once you've figured it all out, it hardly seems worthwhile :)
"This is a common complaint of mine. Why do people do this? Why are they so concerned with getting it right before they're willing to let the world see it? If you throw it out earlier, who knows, somebody might come and contribute earlier. I'm more motivated to contribute when something is half-baked. Once you've figured it all out, it hardly seems worthwhile :) "
This part of what you're saying could well be reasoning in favor of discussion before code. Coding is a process of developing ideas, but so is discussion. This person has let the world see their ideas in the form of a presentation, rather than obsessing over getting them "right" in the form of runnable code first.
Unless... you're not even asking for runnable code? Interesting. Are you asking for people to be comfortable enough to do public brain dumps of all their works-in-progress, regardless of how useful they expect them to be?
No I want runnable code. But isn't all code somewhat runnable? Otherwise it wouldn't be code. Almost any project is runnable within a few hours.
Your argument assumes that the presentation is less work than code, but I don't think that's true. He's clearly put hours of effort into presentation, but there isn't enough for me to even be clear on what he's proposing. Code would be unambiguously concrete in this respect. Even if it only works some of the time, if it has bugs, etc. I'd be able to get a sense of how it ought to work.
"Almost any project is runnable within a few hours."
"Your argument assumes that the presentation is less work than code, but I don't think that's true."
I take this a little personally, because there are many projects where I still don't even know what I want several years in. :) Well, programming is all about knowing what one wants, but I mean I don't even know these projects well enough to identify the core program I should start with. But I like to think I thrive on these ideas, because interesting big projects are the main reason I even give a second thought to little one-day projects.
Also, programming has a skill aspect to it. Unless someone's used a certain tool or technique before, it can be frustrating and intimidating. I personally find several things frustrating that others take for granted, like Emacs, Vim, manual memory management, the command line, and yes, riding a bicycle. :-p If someone's not ready to code up even a hackish language yet, I can relate.
I certainly didn't mean to make it personal. I didn't even think I was talking about you.
I vaguely sense that we're using very different meanings for words like "runnable", "less work", "program", "project" and "right". But now I'm afraid to pick at this further.
I'm not claiming there should be no discussion without code, or that people must have working code when making a proposal, or anything nearly that strong. In this case from the certainty and polish of the presentation I assumed he knows what he wants. And he's referred to code so we know it's not a pure spec. So the bottleneck seems to lie in me understanding his proposal. And I was suggesting that sharing whatever code he has might help me over that hump. Showing code can only ever help, never hurt.
"I certainly didn't mean to make it personal. I didn't even think I was talking about you. [...] now I'm afraid to pick at this further."
Oh, sorry. I'm personally invested in this topic, but I'm not offended. But come to think of it, my post was a few claims fluffed up with personal foibles in place of other justification, and thanks for not being eager to refute the acceptableness of my foibles. :-p
---
"In this case..."
I don't have much of an opinion in this particular case. I was spurred on by the "common complaint" that people don't share their code in progress, and I'm interested in what kind of overall strategy we should pursue in response.
- Social networks for code sharing (e.g. package managers, HTTP, GitHub)?
- Collaborative development of large-scale online worlds (e.g. Wikipedia)?
- Socially encouraging or discouraging people to program depending on their personality?
- Investigating what kinds of programming problems are so mathematically exotic that meaningful code is exactly the thing that's hardest to develop?
- Different laws and licenses related to sharing code?
---
"And he's referred to code so we know it's not a pure spec."
I don't remember that part. I did skip a few boring parts in the video. ^_^;;
Hmm, maybe I even consider runnable code to be relatively boring and forgettable. XD; Probably depends on whether it's a product I'm eager to use right away. ^_^
Weird that they want it to be functional, yet discourage recursion. I wonder if you could write the language without using modification or recursion.
Also, given that functions are first-class, how do they prevent things like the Y combinator? I figured I'd try it out, so I cloned the repository, but it seems to be only usable on the developer's Windows machine. base.res seems to be Windows-only; it has the xml token <assemblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version="6.0.0.0" processorArchitecture="" publicKeyToken="6595b64144ccf1df" language=""/>. Also, the only file with an extension I've even seen before is ppas.bat, which calls a bunch of executables in C:\lazarus\fpc\bin\i386-win32 .
Has anyone gotten this thing to run? There are no instructions on actually running it. The author seems really concerned with showing people what the language looks like without letting them try it out.
I'm motivated like I usually am by things on the internet -- "you're wrong, let me show you how". If functions are first class, then you can't prevent recursion. I think the Y combinator (or the U combinator, maybe?) can make recursion happen quite easily.
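For instance, in Arc (not their language, but the point carries over to anything with first-class functions), self-application gets you recursion even though no function refers to itself by name:

(= fact-step
   (fn (self n)
     (if (is n 0)
         1
         (* n (self self (- n 1))))))

(def fact (n) (fact-step fact-step n))

(fact 5)   ; => 120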
I didn't have the heart to tell him that 1. the language isn't Lisplike, which is a dealbreaker for me, and 2. I don't have room in my life for two languages no one uses.
I don't think he's actually written much in the language (which is somewhat understandable, since the damn thing isn't implemented yet), and that's causing problems because he hasn't realized how annoying simple things like his any statements become.
I haven't asked him this yet -- I'd like him to answer a question I've asked three or four times now first -- but why do some statements run in reverse order? The definition of let and get from http://code.google.com/p/aha-programming-language/wiki/Getti...:
>The statements after the where clauses are evaluated in the reverse order (i.e. from the end)...
This just seems bizarre, confusing, and not at all useful. But we'll see.
If a where clause's binding expression ran after the body in an eager language, we couldn't use the value during the body. We'd be limited to using it from inside closures (manual laziness).
I do have ideas, but none that make sense. :-p It does seem better to embrace top-to-bottom side effect ordering throughout all syntaxes of the language, with as few discontinuity points as possible.
But if we want to embrace that, it's probably hypocritical if we use prefix syntax for procedure application. ^_-
Yeah it'll be interesting to see how it goes. I'm actually more favorably disposed after the thread on the list. His motivation is a reasonable one: to make the static shape of code more closely mimic how it looks at runtime. This requires constraints so it feels kinda wild and whacky. But I'm glad somebody's trying these ideas.
Running statements in reverse order isn't such a big deal. Concatenative languages seem backwards, and even preorder is kinda weird at the start.
The best new languages require so much tolerance to the bizarre that it hurts. Alan Perlis said, "A language that doesn't affect the way you think about programming is not worth knowing." I just realized he didn't say, "A good language.." Potential dead ends can be worth knowing as well.
So, how'd it go? Did someone end up presenting Arc? I've been considering presenting Arc to the Lisp group in NYC, if I can put together a five-minute presentation for Lispers who don't know much about Arc.
It was very fun, but no talk on arc. I ended up talking about http://arclanguage.org/item?id=16378. I'd love to see a presentation on arc; I wasn't sure five minutes was nearly enough.
That sounds like a good talk. Anything beyond your post? I must admit that I haven't put enough time into the macro-apply situation, so I'm not even up to speed with your post, but I'd love to see slides if you have them.
My reason for giving a five-minute talk about Arc is -- well, beyond the usual "I don't know if I can do a longer one" -- I've talked to a few Lispers about Arc, and they're not very excited about it. In fact, it's denigrated. I wanted to give a short talk to convince people they want to learn more. Maybe not 5 minutes, but 15? I don't know.
Yeah I have other, less politically-correct, reasons not to want to talk to lispers about arc :) The reasons for not wanting to talk are themselves hard to talk about.
I'm not interested in trying to convince lispers about anything because they are pretty close-minded about their tools. Trying to evangelize anything risks a minor flamewar, and even if you avoid flamewars you end up thinking you might have better spent your time talking about something else.
At yesterday's talk there was the obligatory old dude speculating that I might be able to accomplish my already-eval shenanigans using reader macros on backquote. In itself there's nothing wrong with the idea. It's totally false[1], but that's ok. The bigger criticism is that I could predict ahead of time that I'd get some question like that. There's a huge bias in the sorts of questions that get asked by CLers, and that predictability is utterly boring. I find questions about why I don't use Apple similarly boring.
Crotchety old lispers are fun to talk to. Often I learn some gem I couldn't find anywhere else. At the risk of sounding condescending, I'm aware I'm on my way to being a crotchety old man myself[2]. I've just learned to avoid certain topics -- and to change the subject when others don't know to avoid them -- so we can all get along.
Talking about lisps is like talking about text-editors. Perhaps it's more important to focus on what we do with our tools. A cool app in arc will be more effective at getting lispers to engage with arc than any words we can come up with. But you'll still have to fend off questions about why you couldn't just use Common Lisp for the purpose. Snort.
That said, I still would love to watch you try to explain arc to non-arc folks. Sometimes explaining a thing helps understand it better. The audience will almost all take it constructively. You'll just need to put up with the prospect of the occasional miscreant. And if you do you'll be a better man than me.
[1] I don't even need to know anything about reader macros to make that assertion, because my feature isn't just some syntactic sugar. It requires semantic support for fexprs. CL has no fexprs. QED.
[2] My girlfriend says I must have been 60 when I was born.
---
I don't want to sound like it was a terrible time. I had a fantastic time there, lots of great conversation with a lot of interesting people, and will definitely go back the next time. But even at a lisp meetup the best conversation is only tangentially about lisp. There was one lightning talk about building a distributed RDBMS inspired by Connection Machine principles. The guy didn't build it in lisp, but he figured lispers might be worth showing it to. And that is awesome; I think many people got a lot out of it.
I found this on HN. Comments here: http://news.ycombinator.com/item?id=4035849 . Unfortunately, it's too late for me to understand and comment. So I'll have to do so later.
Thanks for hosting this site in the first place! It's really useful. Is there anything I can do to help out?
Last I checked, you couldn't paste into the repl (I can't check now, unfortunately, as it's down). That feature will make a huge difference to me -- my main use case for tryarc.org is telling people about arc, and they won't type in an example I give them, especially if it's more than a two-line toy. What needs to happen to get this to work? Again, can I help?
Good news - copy and paste is finally working now (just fixed it). So now people will be able to paste in your examples! :-)
Thank you for the encouraging/motivating comment! Fixing copy-paste has been on my list for a long time, but this finally got me to do it.
I appreciate your offer to help. Maybe I should open-source Try Arc and then people can start issuing pull requests when they're motivated to fix something.
Awesome! That'll really help me show people arc code, as I'm rarely at a computer with Arc installed when I want to show people.
Open-sourcing Try Arc would be cool. I'd love to look at larger Arc projects -- all of mine are middling at best. I'm working on bigger ones, but the lack of libraries is...frustrating, so I'd love to see how other people work with Arc.
Generally, I'll try to show it to some of my programmer friends, or to people at work.
I had actually come up with an interview question: take two strings and return the index of the first difference. I posted some arc code, and wanted to have people run it.
My code:
(def first-difference (seq1 seq2 (o comparer is))
  "Returns the index of the first difference between seq1 and seq2, or nil if
   they are the same. This function uses 'comparer as the function to compare
   elements of the sequences."
  (withs (len1 (len seq1)              ;; 'withs ([var1 val1]* ) binds each var to its val
          len2 (len seq2)
          helper (afn (seq1 seq2 pos)  ;; 'afn makes an anonymous function that
                                       ;; can recurse by calling 'self
                   (if (is len1 len2 pos)                ;; at end of both sequences
                         nil
                       (or (is len1 pos) (is len2 pos))  ;; at end of only one sequence
                         pos
                       (~comparer seq1.pos seq2.pos)     ;; found a difference
                         pos
                       (self seq1 seq2 (+ pos 1)))))
    (helper seq1 seq2 0)))
Edit: no, that's not right, because it doesn't return nil on identical sequences. Here are two variants with and without the accumulator. Which is easier for noobs?
It's interesting to compare this to the solution in my workday language. One of the reasons that M is shorter is because its functions are more tolerant. For arc, I had to make sure that the index to the string was not too large. In M, it simply returns an empty string if you attempt to access a string position beyond the length. And I used iteration with M. Perhaps the arc solution would have been shorter had I done the same there.
Are you concatenating strings? Is that where returning "" is useful? In wart concatenating strings is tolerant of non-strings, so I think returning nil should be ok.
If the string X = the string Y, print 0 (to indicate there are no differences / MUMPS uses 1-based indexing of strings)
Otherwise, go through each character of the strings and at the point where they differ, print that index and quit looping.
MUMPS does not error out on attempting to extract a character beyond the string's length. So in that event, one string's extract will return the empty string, which will differ from the other string's extract and will cause you to print the index and quit. Not generating such an error, of course, cuts down on the code needed.
So if we enter the for loop, we know there is a difference between the two sequences.
So we just need to find an index position i where (isnt seq1.i seq2.i) is true. If indexing off the end of a string causes an error, we need to put in code to prevent that. If indexing off the end of a string returns anything that can't be an element of the string, we can compare each index i until we find a difference, knowing that we'll eventually find one.
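In Arc terms, with a hypothetical accessor that returns nil past the end instead of erroring (Arc's built-in string indexing raises an error there), the loop becomes trivial:

(def char-at (s i)
  (if (< i (len s)) (s i)))      ; nil, rather than an error, past the end

(def first-diff (s1 s2)
  (when (isnt s1 s2)             ; only loop when a difference is guaranteed
    (let i 0
      (while (is (char-at s1 i) (char-at s2 i))
        (++ i))
      i)))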
That worked for me with Try Arc. It's really not a large program, the portion that I wrote. On the backend, all the heavy lifting is done by Racket's sandbox library and srv.arc. On the front-end, it's Chris Done's jQuery console (and to some extent now, WordPress).
What don't you like about the way Arc handles temporary files? Is it that you can't have another file named that? Arc could use `(file-gensym)` instead, if that was a thing.
Thanks for the tip! I had no idea there could be a better way. Now I find that my use of $RANDOM in shell scripts should be replaced with calls to mktemp as well.
Now tofile does the 'write to tmp file and move' dance. writefile is implemented in terms of tofile. Finally, the tmp file is more randomly chosen, so it should be thread-safe.
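Roughly the shape of that dance, sketched in Arc rather than wart's actual tofile (the function name and the random suffix are made up; it assumes Arc's w/outfile and mvfile):

(def write-atomically (val file)
  (let tmp (string file ".tmp" (rand 1000000))   ; random suffix for thread-safety
    (w/outfile out tmp
      (write val out))
    (mvfile tmp file))   ; the rename is atomic, so readers see the old file or the new one, never a partial write
  val)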
I wonder what they do for security. It's standard that you can't trust data coming from the client; I hope they don't blindly evaluate code coming from the client.
I suppose you could encrypt the ANF'd stack, then send it over the wire.
I think what comes from the client is mostly an opaque token that refers to a continuation in the server's session store. Probably something like news.arc, but I'm even less familiar with news.arc than I am with this. :-p
I think other values are passed around as first-class values of the program, not as automatically-evaluated expressions. So if you use eval in your code or otherwise interpret rich user input with loose privileges, you might indeed have a security risk on your hands. If by some chance you do find yourself in that position, Racket also provides a sandbox library so that you can replace your insecure eval with a more tightly controlled one.
I had made a patch to make HN (and AF) allow users to quote text. I tweeted it at pg (possibly rtm also; I don't recall), and posted it on this forum (http://arclanguage.com/item?id=14701), and nothing came of it. It's really too bad; markdown has quote functionality, and the things users currently do are quite inferior (two spaces make code, not quotes; a ">" character looks ugly and doesn't obviously stand out).
Hold on to the patch as it may still be useful. Who knows, something may come of a well coordinated movement by AF members to get the attention of pg and rtm. Even a 100 year language needs an active community. :)