It compiles a major subset of Apple's Dylan (http://lispm.dyndns.org/documentation/prefix-dylan/book.anno...) to JavaScript, both for use on a CommonJS implementation and in the browser. A bootstrapping compiler is implemented in JS, but the same compiler is also available in Ralph itself and features define-macro (CL-like). The whole runtime is defined in Ralph as well and provides a single-inheritance object system (including next-method): https://github.com/turbolent/ralph/blob/master/src/runtime/c...
I notice that it doesn't support multiple inheritance or multimethods; out of curiosity, is that because of a principled objection to them, or more because it wouldn't be convenient to implement them?
At the beginning I tried implementing various object systems. The first one was indeed multiple inheritance with multimethods, based on C3 linearization (http://en.wikipedia.org/wiki/C3_linearization); it didn't use the prototype chain, but was never finished. The second one was similar to Clojure's protocols (define-protocol, extend-protocol, ...), but wasn't very handy.
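For readers unfamiliar with it, C3 linearization can be sketched in a few lines of JS. This is a hypothetical standalone version operating over a name-to-parents map, not the abandoned implementation mentioned above:

```javascript
// Compute the C3 linearization of a class given a map from
// class name to its (ordered) list of direct parents.
function c3(cls, parents) {
  var seqs = parents[cls].map(function (p) { return c3(p, parents); });
  seqs.push(parents[cls].slice());
  return [cls].concat(merge(seqs));
}

function merge(seqs) {
  var result = [];
  seqs = seqs.map(function (s) { return s.slice(); })
             .filter(function (s) { return s.length > 0; });
  while (seqs.length > 0) {
    // Find a candidate head that appears in no sequence's tail.
    var head = null;
    for (var i = 0; i < seqs.length; i++) {
      var cand = seqs[i][0];
      var inTail = seqs.some(function (s) { return s.indexOf(cand) > 0; });
      if (!inTail) { head = cand; break; }
    }
    if (head === null) throw new Error('inconsistent hierarchy');
    result.push(head);
    // Remove the chosen head from all sequences.
    seqs = seqs.map(function (s) {
      return s.filter(function (x) { return x !== head; });
    }).filter(function (s) { return s.length > 0; });
  }
  return result;
}
```

For the classic diamond (C inherits from A and B, which both inherit from O), this yields the linearization C, A, B, O.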
The current one is single-inheritance, because it uses the prototype chain. It's a compromise between speed and usefulness. I'd prefer having multimethods (and maybe also multiple inheritance), but speed is a bit more important, as JavaScript is already quite slow.
So far I'm quite pleased with the single-inheritance and single dispatch solution, which basically works like that: https://gist.github.com/866506
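A rough illustration of the idea (not Ralph's actual runtime code, and the names here are invented): single dispatch falls out of the prototype chain for free, and a naive next-method just looks one step up that chain.

```javascript
// Build an "instance" of proto extended with the given properties.
function make(proto, props) {
  var o = Object.create(proto);
  for (var k in props) {
    if (props.hasOwnProperty(k)) o[k] = props[k];
  }
  return o;
}

var animal = {
  describe: function () { return 'animal'; }
};

var dog = make(animal, {
  describe: function () {
    // "next-method": look up the superclass implementation one step
    // up the prototype chain. (Naive: only correct when called on the
    // object that defines the method; a real runtime records which
    // class each method belongs to.)
    var next = Object.getPrototypeOf(this).describe;
    return 'dog, next-method: ' + next.call(this);
  }
});

dog.describe(); // "dog, next-method: animal"
```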
Man, I forget how nice it is to be able to look into the source of stuff.
Other than being obviously new, this is pretty cool.
Edit: Would there be a better way to fix it than to add functions like this?
function adder() {
  var args = Array.prototype.slice.call(arguments);
  return args.reduce(function (a, b) { return a + b; });
}

function subtractor() {
  var args = Array.prototype.slice.call(arguments);
  return args.reduce(function (a, b) { return a - b; });
}

function multiplier() {
  var args = Array.prototype.slice.call(arguments);
  return args.reduce(function (a, b) { return a * b; });
}

function divider() {
  var args = Array.prototype.slice.call(arguments);
  return args.reduce(function (a, b) { return a / b; });
}
(I haven't really put too much thought into this, but would love to hear of stronger approaches or obvious bad ideas in this one)
In principle that is a nice way to do it, and given that reduce is implemented natively on some platforms it might even be sufficiently performant, but to be honest I'd probably just write them as loops mutating a local variable. A language runtime is one of the places where a little bit of readability can be sacrificed for performance (although of course, one should measure it to make sure the gains are worth the cost).
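The loop version of, say, adder might look like this (a sketch of the suggestion above, not the actual pull request):

```javascript
// Variadic addition as a plain loop mutating a local variable,
// avoiding the intermediate array and the reduce callback.
function adder() {
  var sum = arguments[0];
  for (var i = 1; i < arguments.length; i++) {
    sum += arguments[i];
  }
  return sum;
}

adder(1, 2, 3); // 6
```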
Someone has actually already submitted a pull request doing just this:
With regard to maturity, Fargo is one week into development. I'm not even sure if it's just a quick hack to show off an idea or if it will become a production language. Certainly I have a lot to learn about compilers and VMs before that happens.
Any reason not to implement call/cc and then write fibers in terms of call/cc? The implementation may not actually be any different from what it is. (I haven't studied the code enough to understand how it implements fibers, although I got the impression that it handles call frames explicitly (source/fargo/runtime/stack{,less}.js).)
I was going to do call/cc but fibers are cheaper. The implementation is very similar, but because fibers can only be resumed once from the last yield you don't need to copy the stack when yielding and resuming.
They also require the user to explicitly start a fiber. This means when you're not in a fiber you can use a faster stackless engine because you don't need to track the state of the current continuation.
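The one-shot property being described can be illustrated with modern JS generators (which didn't exist when Fargo was written; this is purely an analogy, not Fargo's implementation):

```javascript
// A generator behaves like a one-shot fiber: each suspension point
// can be resumed exactly once, always from the *last* yield.
function* fiber() {
  var x = yield 1;  // suspend; resumed once, receiving a value
  yield x + 1;
}

var f = fiber();
f.next();    // { value: 1, done: false }
f.next(10);  // { value: 11, done: false }
// There is no way to rewind f to the first yield and resume it
// again, unlike a full call/cc continuation, which can be invoked
// any number of times -- hence no stack copying is needed.
```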
BTW, what's interesting about yield-based versus lazy-evaluation (functional stream) based sequence generation? Advantages of the latter are that the results can easily be understood as sequences, and thus further processed by lazy versions of the usual sequence-processing functions (map etc.), and that they can be re-read multiple times, which you just made impossible for yield by not basing it on call/cc :).
I've been thinking about this. I might try out making all the primitive functions understand promises. So not lazy evaluation per se, but having the core library transparently deal with asynchronous values. Maybe monads would help but that really requires a decent type system.
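One way this could look, as a hypothetical sketch (the `lift` name is invented, and it uses the Promise API, which wasn't standard JS at the time): wrap each primitive so it accepts promises as well as plain values.

```javascript
// Lift a primitive function so it transparently accepts promises:
// if any argument is a thenable, wait for all of them and return a
// promise of the result; otherwise stay synchronous.
function lift(fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var hasPromise = args.some(function (a) {
      return a && typeof a.then === 'function';
    });
    if (hasPromise) {
      return Promise.all(args).then(function (vs) {
        return fn.apply(null, vs);
      });
    }
    return fn.apply(null, args);
  };
}

var add = lift(function (a, b) { return a + b; });
add(1, 2);                   // 3, synchronous
add(1, Promise.resolve(2));  // a promise that resolves to 3
```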
Hmm, I actually hoped for Clojure on Node.js (don't get me wrong, Scheme is beautiful and this seems to be quite a capable implementation, but Clojure is way more practical). Still, I hope this gets popular.
Clojure is a big language, and pretty deeply tied to the JVM; I'm not sure how you'd put it on V8. Scheme is pretty practical, though, once you get used to it.
The first version of Clojure was a Common Lisp compiler that compiled Clojure down to JS to be run on Rhino. There's interest in the Clojure community to have a Clojure "Light" that runs on V8 or SpiderMonkey.
I haven't had time to play around with Clojure yet, but what makes you think it is more practical than Scheme? The primary advantages seem to be JVM interop (which I don't think would be applicable in a Node.js setting) and easy multi-threading (also not applicable?)
Hmm, that might be true. I don't like the Java interop much, as it confines me to the JVM infrastructure, which I really dislike.
Yeah, it might be mostly the functional bias. I was just reimplementing persistent immutable hash tables when I said, well dammit, why not try Clojure. And despite the things I don't particularly like (the lack of TCO, no continuations, and probably most of all missing the Scheme macro system), there are a lot of things they did better.
I really like the data-structure access syntax (no more list-ref). I like that I have generic operations that work on any data structure supporting the seq abstraction. I can see the point of reducing the number of parens in forms. I truly love that partial/curry is really part of the language, not in some implementation-specific extension, and I just love the -> 'operator'. Clojure is for me like Python, with much of the functional programming support that Python only got recently, and with a community focus on this paradigm. Oh, and with a great, active, helpful community in #clojure, unlike the divided community of #scheme.
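For anyone who hasn't seen it: Clojure's -> is a macro that threads a value through a pipeline of calls. Its effect can be approximated with a plain function in JS (a hypothetical `thread` helper, just to show the idea):

```javascript
// Thread a value through a series of functions, left to right,
// like Clojure's -> (for the simple single-argument case).
function thread(x) {
  var fns = Array.prototype.slice.call(arguments, 1);
  return fns.reduce(function (acc, f) { return f(acc); }, x);
}

thread(5,
  function (n) { return n + 1; },   // 6
  function (n) { return n * 2; });  // 12
```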
Before someone wonders: yeah, most can be implemented on top of Scheme, the point is that Clojure did it, and they are popular. Give me a Scheme with all the great things from Clojure and I'll never look back.
Almost all of the features are shown in https://github.com/turbolent/ralph/blob/master/src/tests/run... and I'm using it in a project now. To build HTML5 apps, there's a small toolbox: https://github.com/turbolent/toolbox
Maybe it's useful to someone else. Cheers