Hacker News

Great write-up! I expected this to reference a few "normal" language features like static typing, operator overloading, or generics, but instead it was a list of some really neat off-the-beaten-track features:

  * Run regular code at compile time
  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
  * Add your own optimizations to the compiler!
  * Bind (easily) to your favorite C functions and libraries
  * Control when and for how long the garbage collector runs
  * Type safe sets and arrays of enums; this was cool:
    "Internally the set works as an efficient bitvector."
  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
  * Compile to JavaScript


As a pythonista one thing that struck me in the code fragments is zeroes and ones appearing everywhere. It looks very easy to have off by one errors. (It is very rare to have off by one errors in Python due to the way counting and ranges work.)


I guess you could use fewer magic numbers if you want, for example instead of:

  proc createCRCTable(): array[256, CRC32] =
    for i in 0..255:
You can write:

  proc createCRCTable(): array[256, CRC32] =
    for i in result.low .. result.high:
Or:

  proc createCRCTable(): array[256, CRC32] =
    for i, v in result: # index, value
Or define your own indices iterator:

  iterator indices[T](x: T): auto =
    for i in x.low .. x.high:
      yield i

  proc createCRCTable(): array[256, CRC32] =
    for i in result.indices:
I think I should propose adding indices to the standard library.
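For readers coming from Python, a rough analogue of the `indices` iterator above as a generator (names here are my own illustration, not from the thread):

```python
def indices(x):
    # Yield every valid index of a sequence, mirroring the Nim
    # iterator above for the common 0-based case.
    for i in range(len(x)):
        yield i

table = [0] * 256
print(list(indices(table))[0], list(indices(table))[-1])  # 0 255
```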


It does strike me as odd that some Ada-isms are present, like 'low and 'high (applicable to types and objects), but the ever-useful 'range didn't get included. It is a testament to Nim's flexibility that it can be added, effectively as a one-liner. This really should be in the standard library.


It isn't magic numbers as such; rather it's the smattering of sometimes zeroes and sometimes ones. If it were always zero, or always one, off-by-one errors would be a lot less likely.

Python solves this by always starting from zero and, crucially, not including the final number: range(0, 3) gives 0, 1, 2 (no 3). You can also use negative indexing to count from the end, e.g. range(0, 3)[-1] gives the last element (2). The right thing happens when you mix lengths into the arithmetic too, e.g. range(0, len(item)).
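A quick illustration of those three points (my own sketch, not from the thread):

```python
# Half-open ranges: the stop value is excluded, so lengths and
# offsets compose without any +1/-1 adjustments.
items = ["a", "b", "c"]

print(list(range(0, 3)))           # [0, 1, 2] -- no 3
print(range(0, 3)[-1])             # 2 -- negative indexing counts from the end
print(list(range(0, len(items))))  # [0, 1, 2] -- every valid index of items
```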


The 0s and 1s are specific to the examples. A range in Nim, `x..y`, includes both x and y. Sequences are always 0-indexed. That said, you can start any range at 1 if you want.
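In Python terms, Nim's inclusive `x..y` corresponds to `range(x, y + 1)` (this mapping is my own illustration, not from the thread):

```python
def inclusive_range(x, y):
    # Python analogue of Nim's inclusive x..y:
    # shift the stop value by one so both endpoints are included.
    return range(x, y + 1)

r = inclusive_range(0, 255)
print(r[0], r[-1])  # 0 255 -- both endpoints included, like Nim's 0..255
```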


I don't program in Python, but I thought one of the community's points of contention was the syntax of slices and such (starting at one instead of zero, or one side of the range being inclusive; I can't recall). If that's true, I would imagine it would encourage some off-by-one errors. Then again, maybe it's not nearly as contentious as it sounded in a few comments.


Slices start at 0, not 1. However they are inclusive of both boundary items. This matches how ranges are specified in Nim.


Slices in Python are not inclusive of the second index; lst[0:2] gives you a list containing the first two elements.
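A short demonstration of why the half-open convention is convenient (my own sketch):

```python
lst = [10, 20, 30, 40]

print(lst[0:2])             # [10, 20] -- index 2 itself is excluded
print(lst[2:4])             # [30, 40] -- adjacent slices share a boundary...
print(lst[0:2] + lst[2:4])  # [10, 20, 30, 40] -- ...and rebuild the list
```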


Everything old is new again.


Don't know why you have been downvoted, because this is a kind of "Pascal rebirth" mission.

Of course it borrows some things from Python, but I don't know why that has to be bad. When I was a kid Pascal was a thing, and my first hello worlds were programmed in it (away from C). Then in my teens Delphi was a thing; I relearned to program in it, and it kept me away from Visual Basic (ohh-hay).

Pascal/Delphi was so fun to program with (thanks Hejlsberg!), but somehow the tech world turned into this C-syntax-dominated world, because of the popularity of Unix.

And I must say, playing a little bit with Nim, it makes me feel that again: the joy of programming.

And I've always felt Python was kind of nasty, I confess; not very easy to read the code, like messy code coming from a five-year-old. But Nim doesn't have that feel. It feels like Pascal in the old days: pretty serious but also fun and pragmatic!


Care to flesh that out a bit?


When I saw that the top comment is a list of language features which are mostly extremely old, I thought that was interesting, so I pointed out how old those features are. And got hammered for it in spite of the fact that it's true.

Obviously, it doesn't matter whether those features are old or new. What matters is whether the language executes them in a compelling way. But almost every single one of those features has been a core aspect of programming languages which have been around since before most of us were born.

Also, the mods have strongly urged me to write shorter comments, and to write comments less frequently. And they've suggested that more than once. I try to oblige, and this is the result. So technically I can't flesh out my comments without going against their advice.

Of that list of nine features, I'm certain six are extremely old:

  * Run regular code at compile time
  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
  * Add your own optimizations to the compiler!
  * Bind (easily) to your favorite C functions and libraries
  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
I wouldn't be surprised if GC control and typesafe enums are also extremely old, bringing that up to "8 out of 9 features are old." But it's at least 6 out of 9.


Well, you have to combine just two of these features to leave Nim in a very small company of languages, mostly developed in the last decade. Good performance and meta-programming ("run regular code at compile time") used to be found at the opposite ends of the programming languages spectrum and Nim is unifying them in a single package - that's what people are excited about.


> Good performance and meta-programming ("run regular code at compile time") used to be found at the opposite ends of the programming languages spectrum

Actually, Lisp has been executing regular code at compile time with good performance since the tail end of the 1980s. Maybe even before that. This is why I said "everything old is new again."


Depends on your definition of "good performance". Idiomatic Lisp was never able to compete with the Fortrans and C/C++s of its age, but Nim is.


"Good performance": Within a factor of ~2 to ~4 of native C.

No one said idiomatic Lisp was performant. They said Lisp was performant. Seriously, it's not even difficult. It's not like it's a situation where it's easy to accidentally kill your performance by writing elegant code while having no idea why it's slow.

Optimizing a Lisp codebase is a straightforward exercise, no more difficult than optimizing a Lua or Javascript codebase.

If someone hasn't shipped more than a toy program with Lisp, it's easy to believe the opposite. But if you try it, you'll see that it's true.


To be a bit more specific as to where else these ideas are found:

  * Run regular code at compile time
Lisp, of course. Since very long ago.

  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
Lisp, but also Dylan, Elixir, JS with Sweet.js and more. See for example here: https://opendylan.org/articles/macro-system/index.html

  * Add your own optimizations to the compiler!
I'm not sure here.

  * Bind (easily) to your favorite C functions and libraries
Almost every serious language has an FFI for C. Nimrod offers the possibility of using statically linked libraries, which is not very common, but Gambit Scheme, for example, has done this for a long time.

  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
Lua, somewhat. Also Dylan again.

  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
Too many to count.

  * Compile to JavaScript
OCaml. And more: almost every language has a compile-to-JS project for it (although only a couple are workable).

EDIT:

> typesafe enums are also extremely old

Aren't those just sum types? This would make them as old as ML at least.

This of course doesn't make Nimrod any less interesting a language, and some combinations of those features may be rather novel. Still, "everything old is new again" is, I think, 100% true for all of these features. In reality you won't ever encounter truly unprecedented features in general-purpose languages outside of academia; one can go to the LtU site for those. All the other languages are built on really old (proven) concepts, which they reimplement and package in (very) different ways. I have no idea why you would get downvoted for stating such a fact.


> * Add your own optimizations to the compiler!
>
> I'm not sure here.

Lisp, via compiler macros: http://www.lispworks.com/documentation/HyperSpec/Body/03_bba...

> The purpose of the compiler macro facility is to permit selective source code transformations as optimization advice to the compiler. When a compound form is being processed (as by the compiler), if the operator names a compiler macro then the compiler macro function may be invoked on the form, and the resulting expansion recursively processed in preference to performing the usual processing on the original form according to its normal interpretation as a function form or macro form.


Simple... the list of languages that provide Lisp-level dynamism AND C level performance is rather short.


Some Lisps, some Smalltalks, modern C++ and D, Haskell, Clean, SML, OCaml, Dylan, Rust, Factor, Felix, some (AOT compiled, commercial) Java implementations... and so on.

Only a very tiny fraction of ideas or implementations are really new in programming language design. There is "prior art" for nearly everything, sometimes dating back to the sixties. And being both dynamic and fast has been a research topic for decades now. The problem is hard, but we're steadily making progress; as a result we've created a lot of (both abandoned and still alive) languages along the way.


Those are generally 2x to 5x slower than C.


Have you ever actually used Lisp, or are you going off of speculation? I have. It can deliver just short of C level performance.

This is like someone going onto a forum and saying "Java is slow!" while ignoring the fact that trading institutions use Java as their primary language for stock market trading. And they wouldn't do that if Java was slow.

I'm trying to be as patient as possible, but this is really getting out of hand. Most people in this comment thread simply have no idea what they're talking about.

Specifically, look up "GOAL Lisp". Gamedevs wouldn't use Lisp if Lisp wasn't capable of providing performance comparable to C. The entire company would've died.


Yes, I have. Lisp code with all the tweaks to even think about approaching C in speed is ugly, ugly code.


GOAL code was many things, but "ugly" is a stretch. Jak and Daxter did more than any game of its time, and the only reason they were able to do that is the flexibility of the codebase.

Its performance wasn't merely on par with C, but actually surpassed C in many cases. For example, they were able to stream content from disc dynamically while the game was running, and no other engine had that capability. That sounds unrelated to "performance" since it's a design decision, but in fact "performance" is everything which the end user cares about, such as load times. And the only reason the game engine performed well was Lisp. Many of Jak and Daxter's competitors didn't survive, performance problems being one of the reasons. (Game publishers used to cancel projects if development milestones weren't being met, such as a convincing tech demo. So if a tech demo was unconvincing, e.g. due to extreme performance problems, publishers basically killed the company. Maybe that's even still true as of 2015, but it was definitely true in 2001.)

EDIT to your reply: You're right, I edited my comment and added a second paragraph. Also, I apologize for my earlier tone. It was uncalled for and needlessly argumentative. Sorry.


You keep making this argument without really any justification besides "it just is". Everything I've read about Jak and Daxter says it's pretty much amazing it was actually released, because ND spent so much time on the esoteric, non-standard, bus-factor-one compiler-cum-game-engine that they had very little time to actually make the game.


> I have no idea why you would get downvoted for stating such a fact.

And not merely downvoted, but karma carpet bombed (-10). Such an honor used to be reserved for troll comments rather than true statements.


Isn't compile-to-JS mandatory nowadays?


Ah, yes, I remember back in 1959 when we all was rocking the Cobol and I used to auto compile straight to Javascript with a Future module so magnificent that it auto-compile-manifested a V8 JS engine with which to run it. Deal with that boring Nim compiler.



