Nim – Python-like statically-typed compiled language (nim-lang.org)
155 points by tomrod on Aug 16, 2020 | 190 comments


I'm stuck in this cycle with languages like this. I want to use Nim; I want to use Crystal. To me they represent real, tangible improvements over the languages they were based on, but I run into the issue of ecosystem and adoption every time.

I know it's a chicken-or-egg problem, but I have real trouble choosing these languages when they just aren't as battle-hardened or as versatile package-wise as, say, Python or Ruby, when making decisions about tech stacks for my own startup or for other companies I have worked for or will work for.


Another problem I've run into with Nim and Crystal is the lack of editor support compared to other statically typed languages like Java and C++.

Arguably a really nice benefit of these languages is the inline error highlighting, which speeds up development. I've found for Nim and Crystal I have to run the compiler in the terminal, look where the errors were, and then find those lines in my editor.

LSP support is minimal (especially when a language is starting out). My ideal editor for Nim would be PyCharm-like (open source edition) with an LSP backend.


> I've found for Nim and Crystal I have to run the compiler in the terminal, look where the errors were, and then find those lines in my editor.

I pretty much do this with any language I use, including C++ or even Java and Kotlin. I just use Vim for everything; it makes it easy to switch between languages without dealing with a new IDE. And I haven't used autocompletion in any language in probably 10 years -- that aspect is overrated.


In Vim I don't recommend running the compiler in the terminal; instead, set makeprg and errorformat appropriately and use the :make command and the quickfix list.
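For Nim specifically, a minimal sketch might look like this (the errorformat pattern is my guess at Nim's usual "file.nim(12, 5) Error: ..." message shape, so adjust as needed):

  " Run the Nim compiler on the current file with :make
  set makeprg=nim\ c\ %
  " Match messages like: foo.nim(12, 5) Error: undeclared identifier: 'bar'
  set errorformat=%f(%l\\,\ %c)\ %m

After that, :make populates the quickfix list, :copen shows it, and pressing Enter on an entry jumps straight to the offending line.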


I’d appreciate a link describing this more fully if you have one handy.


Usually with Vim the manual is better written and more accurate than any external source of information, so I recommend using the command ":help quickfix". It explains usage of the quickfix list, the :make and :grep commands, and how to configure the errorformat and makeprg variables.


True, though the VSCode Nim plugin "just worked" for me. The compile messages work too, I think using the Nim LSP [1].

1: https://github.com/PMunch/nimlsp


If the compiler is fast enough, you don't need LSP support; many editors can be configured to run any sort of compiler or linter for error checking, and to do so on save or as you edit, &c.

Vim, for example, has a plugin called "syntastic" that is a kind of meta-plugin aggregating lots of these, and indeed it seems to have support for Nim error checking.

LSP is a nice step forward, but people had error highlighting for decades before it was invented.

[edit]

Doesn't change anything above, but someone pointed me at ALE, a newer tool that can run lint engines asynchronously (using interfaces that didn't exist when syntastic was created). For Vim specifically that may be the better tool, but I haven't tried it yet.


>> and can be configured to do so when saving or as you edit

That's true, but being able to quickly click on the line that has the error and have your editor navigate to the file automatically is slightly more painful to set up, and is bespoke for whatever tool you are using.

My point was just that a tight integration between language and IDE/editor makes the overall language experience much nicer. For example:

Racket/DrRacket, Python/PyCharm, Java/IntelliJ, Emacs Lisp/Emacs


>> That's true, but being able to quickly click on the line that has the error and have your editor navigate to the file automatically is slightly more painful to set up, and is bespoke for whatever tool you are using.

In the event that the tool doesn't output in one of the common formats, it's usually just a regex; again, I had this in Vim for several different languages back in the '90s.

>> My point was just that a tight integration between language and IDE/editor makes the overall language experience much nicer.

I agree (happy SLIME user here), but it often seems like I'm in the minority. I've literally never met anyone in person who uses PyCharm, and you are about the third person online I've seen mention it in a favorable light.


ALE is quite nice. LSP is nicer still, but ALE is a good step up in usability from syntastic.


ALE will use LSP servers when they are available


The LSP enables a lot more than diagnostic reporting


It sure does, but GP specifically mentioned diagnostic reporting as something missing when using the "new hot" languages


I don't get why more editors don't support parsing compiler output to get the file name and line for errors. It's dead simple: no need to run a server, just allow running the compiler from within the editor and have some way of configuring the parsing so it can work with different compilers. That gets you 90% of the value for 10% of the work (because you don't need to do anything to the compiler).
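As a sketch of how little work the parsing is (the patterns here are illustrative, not tied to any particular editor, and far from exhaustive):

  import re, subprocess

  # Two regexes cover a lot of compilers' diagnostics:
  #   gcc/clang/rustc style:  file:line:col: message
  #   Nim style:              file(line, col) message
  PATTERNS = [
      re.compile(r"^(?P<file>[^:\s]+):(?P<line>\d+):(?:\d+:)?\s*(?P<msg>.+)$"),
      re.compile(r"^(?P<file>.+?)\((?P<line>\d+),\s*\d+\)\s+(?P<msg>.+)$"),
  ]

  def diagnostics(cmd):
      """Run a compiler and yield (file, line, message) tuples."""
      out = subprocess.run(cmd, capture_output=True, text=True)
      for raw in (out.stdout + out.stderr).splitlines():
          for pat in PATTERNS:
              m = pat.match(raw)
              if m:
                  yield m["file"], int(m["line"]), m["msg"]
                  break

  for hit in diagnostics(["nim", "c", "main.nim"]):
      print(hit)  # an editor would render these as clickable locations

An editor only needs this mapping plus "jump to file:line" to give you clickable errors.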


Indeed. Emacs and Vim have both supported this for a long time. I used it in Vim in the late '90s, and Emacs usually got features like this before Vim.

I was working on integrating a language with a "programmer's editor" <name omitted to protect the guilty> and asked how to extend the editor to indent a new language. The response was "it should already support all C/Java-like languages with the configuration format." My response: "this isn't a C-like language." The community response: "Uh, I guess write a plugin that installs a hook on the carriage-return key and manually positions the cursor?" It blew my mind that anything billed as a programmer's editor couldn't already be customized for this...


You don't even need to configure it in many cases. I've been doing it with numerous languages in Emacs, and I think I had to configure it (with a regex) maybe once for a compiler whose output it didn't already recognise.

The syntax compilers use to indicate filename and error location is similar across compilers, and generally quite unambiguous, so in many cases no configuration is needed. A small list of regexes built into the editor is enough.


Emacs has been able to do this for a very very long time


"Turing-complete text editor can do anything" :)

To be fair, I don't actually know what percentage of editors have that feature; my impression is just that many people aren't aware of the idea. Maybe I'm wrong though.


> which speeds up development.

Perhaps it's because the product I work on is fairly mature at this point, but "speed" isn't something I really optimize my workflow for.

It's much more important that I solve the right problem than that I solve "a problem" quickly.

When the lifetime of the code you write is measured in years / decades, saving minutes seems much less relevant.


I should clarify that when I wrote "speed", in my mind I was thinking of code navigation.

I've found that tight language integration in the IDE/editor makes jumping into new code bases in the language much easier. Refactoring also becomes easier.

So I guess I agree with you that for the very common programming task of "Thinking about the problem" tight language integration doesn't help much. But in order for me to gather data points about the problem, I usually have to jump in and explore the code base, which is where I feel the advantage comes from.


> I've found for Nim and Crystal I have to run the compiler in the terminal, look where the errors were, and then find those lines in my editor.

Definitely set up your editor to run the compiler for you and give you a list of errors that link to the locations where they occur.

In Vim, for example, you can configure the makeprg and errorformat variables and then use the :make command to run the compiler, which populates the quickfix list with the errors. From the quickfix list you can jump directly to the locations. In Emacs there is the M-x compile command with very similar functionality.
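On the Emacs side, a minimal sketch (assuming the community nim-mode package and a nim binary on PATH; the hook name comes from that package, not from anything built in):

  ;; Give Nim buffers a sensible default compile command.
  (add-hook 'nim-mode-hook
            (lambda ()
              (setq-local compile-command
                          (concat "nim c " (shell-quote-argument buffer-file-name)))))
  ;; M-x compile runs it; M-g n / M-g p (next-error / previous-error)
  ;; then jump between the reported error locations.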


In the case of languages, the chicken-and-egg problem has been solved. Looking at every major PL with widespread adoption, the common theme is that it was chosen by a big company (the chicken) as a natively supported language for its platform. See: Obj-C and Swift with Apple, C# and C++ with Microsoft, Java with Sun, C with GNU, JS with Netscape, etc. If you build an ecosystem, they will come. Languages that rely on a community to build an ecosystem just for the sake of using the language itself are very few and far between.


C++ has also been the lingua franca of media and game development since the mid-to-late 90s.


SGI's STL implementation and later EA STL were game changers for C++ imo, they really kicked off "modern C++" as we knew it.


C++ was used in game development long before the STL was widely used or even reliably supported by compilers.

The object-oriented and high performance features of C++ make it a good match for games. Seems to me this makes Rust a fairly natural choice for high-performance games in future.


Rust by design tries to discourage you from using object-oriented patterns. Not saying you can't write games with it, but it'd look more similar to C than C++.


Interesting, I hadn't thought of Rust as discouraging object orientation particularly. But then, I think of the Linux kernel as object-oriented C in places.

If you look at Unity, although the games are written in C#, there is a heavy emphasis on mixins for giving functionality to game objects, rather than inheritance or interfaces.

Between mixins and the ECS pattern, I think Rust's trait-oriented style looks aligned with that trend.


But interestingly that was not the case for Python, which is definitely in the top 5 most popular languages today. I wonder how/why Python accomplished this without any corporate backer?


Python has always had a few really strong killer apps. In the past, stuff like Zope (GvR himself worked for a few years on this, and it's still around), and more recently things like Django, numpy, scipy, matplotlib, PyTorch, or open scientific computing in general.

Python seems like a very popular choice when the time comes to replace bash or perl scripting from the 90s. Oodles of internal tooling and OS tools in the Linux space use it.


I don't understand why people needed or used Zope. Can someone explain?


Python was one of only 3 languages allowed at Google (along with Java and C++) back in the 2000s.


That's one exception I had in mind. It was a no-name language before Google adopted it in the 2000s, but that adoption was internal and couldn't have had much effect beyond the downstream effects on general sentiment around Python. Python was just so much nicer a language than its competitors that the community created an ecosystem. That's very hard to replicate. A lot of these new languages I'm seeing are just incremental improvements.


That is what people tend to think, but the corporate backers were all of Guido's employers, and back in the old days (Python 1.6 and such) Zope was the big reason to use Python.


Schools. There was a major push to teach Python in high schools and colleges.


Python, TCL, PHP, Perl, Ruby, Lua, and more. So it happens, just not recently.


Python: all of Guido's employers; Zope was big during the early 2000s.

TCL was designed at Sun and pushed in the early days of the Web with application servers like AOLserver and Revit (or something similar, I've forgotten the name).

Perl has been on UNIX since the early days and was the tool to use when sed and awk scripts got too complicated.

Ruby was made famous thanks to Rails, and gets very little use outside Rails.

Lua is used as a scripting language in a couple of game engines, and was made famous via World of Warcraft.


TCL was pretty well established before Ousterhout went to Sun.


You could argue it was chosen by Linux, since it was bundled in most distros.


Was there any language that solved the chicken-and-egg problem using a nice FFI?


Go, Rust, Python, Ruby?


Lisp, Ruby, PHP (unless... WordPress? I'm pretty sure Facebook is the "tail end" of PHP)...


> C with GNU

Instead of this pondering and retroactive application of intuition onto history, it would help to learn the history. (Hint: what does GNU's name stand for?)


I hear you.

For me it makes compiled languages fun again. I like Rust, but package maintenance gets rough.

Things like this are what I'm enjoying discovering: https://robert-mcdermott.gitlab.io/posts/speeding-up-python-...


Second that, I find Nim fun to program in. Not sure exactly why... Perhaps it's a bit of the feel of the old Python 2 days, but with better package management, less OO, and static binaries. Or its more advanced abilities, like proper macros, that solve annoying pain points of static languages like C++.

Crystal looks nice too, but for embedded work I want code that compiles to C/C++, not LLVM.

Not sure about Crystal, but I’ve been able to find all the packages I needed for embedded and even some ML work. That Nim can wrap C/C++ libraries really helps there, otherwise it’d be a pain. For example I wanted to load PNG images, well someone wrapped a C library [1]. Or to create Elixir NIFs (with exception safety!) [2]. Generally if there’s a C/C++ package it only takes a few hours to wrap and use.

Nim has some annoying parts, like understanding the ref/var distinction and lack of examples. The documentation is fairly complete otherwise.

1: https://github.com/define-private-public/stb_image-Nim
2: https://github.com/wltsmrz/nimler


Strange to see PowerShell Core have such poor performance. Isn't it basically .NET in a scripting language? I've had great success grepping through gigs of log files with PowerShell, though I recognize that's apples to oranges.


It's neat, but (a) it's not a representative example, as it's mostly about the stack and not the computations, and (b) you would probably just use Numba for any serious one-liner speedup in Python (at least for scientific-style computations) and gain cffi, SIMD, vectorization, interoperability with numpy, typed containers, etc.


The "python is performant because you call into C" argument implies that whenever any fundamentally new algorithm is developed you need to do it in C. This is actually gatekeeping things like machine learning to the few that can build and maintain the C code for libraries like TensorFlow.

Python is the ultimate glue code scripting language, but you can't build a performant algorithm without an efficient C implementation underneath. With languages like Nim taking performance seriously, people could build a complete implementation of such algorithms from the ground up.

This doesn't just mean the core implementation would be easier to maintain. There's a huge gap between people who use the Python libraries and people who build the efficient code that runs underneath. This clear division between those two worlds is what makes it difficult for people to jump from one side to the other, and that's why I'm calling this gatekeeping.

Not necessarily Nim, but using a performant, simple, high-level language for this sort of task would blur this divide. In practice, it means AI researchers in universities could dive into the code that's actually doing the work, not just play with the toy buttons and levers Google and Amazon left for them.


?

Nim is compiled into C, IIRC, but it uses Nim syntax and rules and you don't call it "C". In the same way, Python's Numba is compiled into LLVM IR at runtime, and you basically write Python, with a few limitations. You don't have to write or know any C/C++ to write high-performance numeric algorithms in Numba (just clarifying), and there is no "C implementation underneath" as you're saying. It's also one of the easiest ways for "AI researchers in universities to do the work", because they and their friends probably already know Python but not Nim.
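To make that concrete, a minimal Numba sketch (plain Python, no C anywhere in the user's code):

  import numpy as np
  from numba import njit

  @njit  # JIT-compiled to machine code via LLVM on first call
  def total(xs):
      acc = 0.0
      for x in xs:  # an explicit loop, which would be slow in plain CPython
          acc += x
      return acc

  print(total(np.arange(1_000_000, dtype=np.float64)))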

Re: ML libraries like lightgbm and many others - they are written in C++ so that there's a public C FFI which can then be wrapped in other languages, and not only in Python. This is probably the most flexible way to do things as opposed to limiting the whole thing to one niche/language.

// I'm not saying Nim is bad, Python is good, or any of that - I like alternative languages myself, but being an "AI researcher" and practitioner and spending most of my work time on developing numeric ML algorithms, I'd never look into Nim for doing any serious work, at least not now, partially because then I'd be the only person maintaining whatever I write alone and forever.


I stand corrected. I did not know Numba and assumed it was similar to numpy and the others, which do wrap C.

I still stand by my comment, though. We need to discuss technology by its own merits. Of course I'd also choose Python for an ML project any time of the day! But that's because Python won the popularity contest a long time ago. When discussing technology I think it's worth trying to see past that.

Exposing a C FFI may be flexible in the ways you mention (i.e. you can call the function from many languages), but I think we miss a lot on explorability. Let me elaborate with an example: most people are not realistically able to drill down into implementation details when using neural network libraries like TensorFlow (which is not the whole field of AI, just an example!). At some point, if a feature is missing, you have to leave Python, learn a new language, and get a whole dev environment set up. At that point, you're not using Python anymore, so I don't think it should count as a Python merit that you can do it.

That being said, I don't know Nim enough to validate whether using it for this would be a good idea.


You can define and expose a C interface to Nim libraries, as you can with C++. See https://github.com/c-blake/lc/ and extensions/lcNim.nim, for example. lc happens to load it from Nim, but the produced shared lib could also be dlopen()'d from C.
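For reference, a minimal sketch of that C interface from the Nim side (the names are illustrative):

  # mylib.nim -- build with: nim c --app:lib mylib.nim
  proc add(a, b: cint): cint {.exportc, dynlib, cdecl.} =
    a + b

The resulting shared library exposes a plain C symbol "add" that can be dlopen()'d and called like any other C function.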


With ARC's "deterministic" memory management, interfacing with Nim DLLs from other languages should be better, although I've never tried it.

One trick that works as a guarantee for me is writing Nim code in an almost pseudo-language style, which in theory makes it easier to port to other languages; it's very easy to do.


Hm, Julia 1.4 takes 9.5 seconds on my machine (Intel i5 4460). Not bad at all.


Two ways to get infrastructure like libraries, editor and IDE support, and so on:

1) The language is sponsored by a tech giant, or

2) It slowly evolves over a period of 15-20 years, provided enough people see its usefulness even in the face of lacking features/libraries/tooling etc.


Same. I was starting a new CLI project the other day and thought “ya know, I’ll try crystal instead of doing this in go.”

But then I got to the home page, saw “fund crystal and help make it production ready”, and felt like it wasn't a risk worth taking for something I needed to get out the door.

I did build a small toy project in Crystal and really enjoyed working with it.


For a while now I've been looking at using Nim, and this weekend I did my first project with it: a tool for creating binary wrappers. In Nixpkgs we often create wrappers, but those have thus far been shell scripts, which cause trouble at times on e.g. OSX. https://github.com/NixOS/nixpkgs/pull/95569

I found it easy to get up to speed and do something useful with it. The language is also in my opinion very readable. The documentation could use some more (larger) examples though.

For this wrappers tool I needed a front end, for which I wanted an argparse-style interface. Unfortunately the standard library lacks an implementation, and third-party packages did not deliver what I needed, so in the end I still wrote that part in Python.

The biggest issue I currently find is the lack of a lock file format for its package manager. It's being developed, and as soon as it's there I intend to implement support for it in Nixpkgs. https://github.com/nim-lang/nimble/issues/127

The compiler gives very useful output and is fast as well, and the generated binaries are small. I like this language!


If you use the TinyCC/tcc back end, compiling is nearly instant (I use the "tcc mob branch" all the time). You can even put a shebang at the top of your file (#!/usr/bin/nim r) and have an edit-run cycle similar to scripting languages; then it's just a tiny edit to "nim c -d:danger foo.nim" to get a self-contained binary with production/deployment performance.
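The whole "script" can be as small as this (the shebang passes "r" as the single argument the kernel allows):

  #!/usr/bin/nim r
  # Edit, run, repeat -- just like a scripting language.
  echo "hello from Nim"

and when it's time to ship, "nim c -d:danger hello.nim" gives you the optimized binary.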


Nimph has had lockfiles for months.

https://github.com/disruptek/nimph


Thanks! I was only aware of Nimble. I'll have a look at it.

One thing I noticed right away that it seems to be lacking (judging from the `nimph.json`, assuming that is the lock file) is checksums over the data, or hashes of the revisions the references correspond to. References such as tags are mutable, and thus hashes are needed to validate them. See e.g. the discussion over at https://github.com/NixOS/nix/pull/3216.


Please open an issue if you want to change the semantics; it's trivial to use the hash instead.



Nim is an amazing language that suffers from a lack of momentum and hype. Technically, there’s little reason why it shouldn’t be a good choice for many programming problems.

Reality doesn’t work like that, of course. Hype is important, and not just because that will mean more libraries. Hype motivates people to do things they wouldn’t have otherwise done at all. Rust demonstrates this well.


Is it hype, or controversy then? :P The latest development in Nim is that reference counting is planned to replace the GC: https://forum.nim-lang.org/t/6549. However, this is being done in a non-breaking way, and there isn't much criticism, apart from the development effort and the countless bugs that were introduced. Hopefully this will lead to better multithreading, which is a shortcoming of the language, thus opening new fields.


> Nim is an amazing language that suffers from a lack of momentum and hype.

And good multithreading and network libraries.


Shouldn't the title be "with static typing"?

Python has strong types. It's just that it checks them at runtime, not before.

If you don't use MyPy or similar, anyway.


It should. It's the same as with Common Lisp, which is strongly dynamically typed. Strongly, because it is impossible to confuse a given value's type via some sort of implicit casting; dynamically, since types are checked at runtime.


I think in modern parlance Common Lisp uses a gradually typed system, at least in some implementations.

Common Lisp allows for type inference, which some compilers use to various degrees, and also lets you declare the type of a given symbol (or the signature of a function). Depending on the compiler and optimization settings, that information is then used to generate optimized code, much as a compiler for a (weakly) statically typed language would, or to detect mismatches. SBCL is probably one of the most advanced Common Lisp compilers in this regard.
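A minimal sketch of what those declarations look like in practice (SBCL):

  ;; With declarations, SBCL compiles this to tight float code and warns
  ;; at compile time when it can prove a call passes the wrong type.
  (defun add-floats (a b)
    (declare (type single-float a b)
             (optimize (speed 3)))
    (+ a b))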


Sure, that's the case in SBCL; the type inference engine in it is rather advanced. With type declarations here and there (which are capable of turning Common Lisp into a weakly statically typed language), it can type-optimize a lot of code. It also generates warnings when type mismatches are detected at compile time.


That may be the case. I apologize, I am a tinkerer programmer and the semantics are sometimes lost on me.


Python's new typing is so useful. mypy works most of the time, though it sometimes misses things. Running with --strict is a bother (no one is going to annotate __init__ as returning None, thanks...)



Weird... Looks like you're right. Edit: aha, looks like this is the case under which it happens:

  class Testing123:
      def __init__(self):
          self.blah = whatever()
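For reference, the annotation --strict asks for is just the return type:

  class Testing123:
      def __init__(self) -> None:
          self.blah = whatever()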


If anyone wants to know how Nim can be used in practice, you might want to take a peek at the NimSvg library. It’s a DSL for generating SVG files and GIF animations and to me it looks quite compelling.

https://github.com/bluenote10/NimSvg


The contents of the examples dir are pretty slick. I've never even touched Nim, but these make more sense to my brain than raw SVG. It says DSL but reads more like an improved alternate syntax. Thanks for sharing this, and there goes my afternoon :)


It has a RISC-V backend in case anyone was curious. And it can compile down to C/C++! I have a C/C++ runtime environment that is very unique, and I don't even know how to begin explaining it, but the fact that I potentially don't have to build shims on top of it is just very interesting to me. I'm going to tinker with it.


Exactly, llvm doesn’t support many unique backends. Be sure to use the ARC garbage collector! Certainly checkout c2nim.


I like Nim. It's almost a year since version 1.0 of the language was released (in September 2019), and I wonder how the language has fared since then. Has usage grown since that key milestone? Gaining 'mindshare' among developers is obviously important but also difficult. Of course, supporting libraries are also important in encouraging adoption, but that is just one piece of the picture.

I see a pattern of sorts for some new(ish) languages: without a prominent, well-known sponsor, a language might struggle to gain attention among developers. I wonder what the fate will be of other languages without generous sponsors as they head toward their version 1.0, e.g. Crystal or Zig?

Julia is doing relatively well without a big main sponsor - interest seems to be growing in the language.

Rust and Go are two languages that have benefited hugely from support from Mozilla (Rust) and Google (Go).

But back to Nim. Who's using Nim? What are you using it for and how are you finding it?


> Who's using Nim? What are you using it for and how are you finding it?

I am using it for game and music programming: https://github.com/paranim

Only been using it since the beginning of the year and it is fantastic. Much less development friction than rust, and perf is very good, especially with ARC.


I'm also using it for game programming :) A good game framework in Nim is https://github.com/ftsf/nico

Also, for linear algebra and machine learning there are a couple of good libraries:

1) Arraymancer https://github.com/mratsim/Arraymancer

2) Neo https://github.com/unicredit/neo

3) My own :) It's not meant for when performance is required. https://github.com/planetis-m/manu


One of the near-production clients for Ethereum's proof-of-stake blockchain is written in Nim.


Didn't know that! Any good articles on why they chose Nim?


It's called Nimbus. Since it wasn't mentioned, here's their website: https://nimbus.team/ and why they chose Nim is explained here: https://our.status.im/status-partners-with-the-team-behind-t...


Ooh, I didn't realize Status (the company behind Nimbus) is financially supporting the Nim team now.


The amount of PL support from crypto is insane!


Old but relevant: https://nim-lang.org/blog/2018/08/07/nim-partners-with-statu... In short, thanks to Status's funding, Nim's development has accelerated since then.



Loads of interesting stuff available in the talks from NimConf 2020:

https://www.youtube.com/playlist?list=PLxLdEZg8DRwTIEzUpfaIc...


Worth mentioning is NimScript, a subset of Nim which can be interpreted and embedded like Lua: https://nim-lang.org/docs/nims.html

If for some reason the Nim compiler starts becoming slow, this could also drastically speed up development.
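As a bash-replacement sketch, a NimScript file is just Nim run by the compiler's interpreter (save as clean.nims and run with "nim e clean.nims"; the file name and its contents are made up):

  # clean.nims
  import strutils              # for endsWith
  exec "git status --short"    # exec, listFiles, rmFile are NimScript built-ins
  for f in listFiles("."):
    if f.endsWith(".tmp"):
      rmFile f
  echo "cleaned"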


Yeah, it really is a hidden gem. You can use it as a bash replacement, and with the new compiler API you can embed the entire interpreter in your program. My only complaint is that the interpreter aborts the process when the NimScript code has an error, so you can't really use it for hot code reloading right now. I hope they fix that eventually.


There is really no way to trap it? Maybe fork the NimScript stuff to give the option to try to hang around?


I did manage to fork it and prevent it from aborting, but I don't want to maintain that fork forever. I tracked it down to this function:

https://github.com/nim-lang/Nim/blob/cd28fe2ef7a204721efa720...

It's calling `quit` which cannot be caught in a try statement.


Other than having superficially similar syntax, is Nim really Python-like?

Also, Nim seems to be targeting a different niche from Python. As Nim is statically-typed and AOT-compiled, its closest competitor is probably Go. The difference is that Go deliberately tries to be a simple language, while Nim is much more complex.


I partly agree with you: Nim is a complex language. Just have a peek at the manual, https://nim-lang.org/docs/manual.html , where Nim's features are documented.

However, for me Nim appeared simpler when starting out. Not much gets in your way when prototyping and scripting. Other languages impose more mental overhead; I'm referring to Go's error handling, Java's mandatory exception handling, Rust's lifetime tracking, etc. Thus it's much more pleasant to use than the rest.


Nim's Pascal-inspired type system is very powerful; it feels very different to me. Maybe "a better Pascal with Python syntax" would describe it better?


In what ways would you say Go is simpler than Nim? I did try out Go but found it a bit difficult. Maybe I would need to spend some more time with it.


Nim has more features, like generics and metaprogramming via templates and macros. You don't have to use them, but if you read someone else's code that does, you might have trouble understanding it, which probably doesn't happen with Go.
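A tiny sketch of the kind of thing that means (illustrative, not from any real codebase):

  # A generic proc: one definition works for any comparable T.
  proc biggest[T](a, b: T): T =
    result = a
    if b > a: result = b

  # A template: the body is spliced in at compile time.
  template twice(body: untyped) =
    body
    body

  echo biggest(3, 7)       # 7
  echo biggest("a", "b")   # b
  twice:
    echo "hi"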


That's a poor excuse. You cannot do everything with just a few constructs; you will want features that help you do more, or create abstractions easily, and that requires additional syntax. In Go, the absence of generics makes you write more code, increasing code length unnecessarily.


FWIW, my answer was not meant to trash Nim. I like and use Nim.

And to be perfectly transparent, I know relatively little of Go. My impression is that Nim has more features in general, which could be considered to make it more complex. That can be a good and/or bad thing depending on perspective.


Go is adding generics, so you may have to read other people's code using them.


Nim does have a fair amount of Perl-like "more than one way to do it" constructs, like countup(), ".." and "..<".
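For the curious, all three spell the same loop:

  for i in countup(0, 9): echo i   # explicit iterator
  for i in 0 .. 9: echo i          # inclusive range
  for i in 0 ..< 10: echo i        # exclusive upper bound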


Has there ever been an analysis of what percentage of total bugs can be prevented by static typing? I moved from static to dynamic languages, and in my experience static languages don't provide much upside, but then I have always worked on smaller teams.




This is a good read. In a nutshell, would it be fair to say: start with Python and move to Java when the team grows to 50+?


To me the benefit of static types is more about providing help as I type. Anything that makes errors appear as early as possible, and provides suggestions that make sense, makes me so much more efficient.


My experience: I used to interview and do leetcode in C++ and I could never match the speed of Python people. I switched to Python and I could.


What is easy or hard in a few hundred lines of code doesn't say much about what is easy or hard in thousands of lines.


My main gripe with Nim is that documentation (even for the standard library!) is in absolute shambles. Not to mention the lack of an ecosystem.

I think that authors of Nim libraries get so tied up in the superiority of having types and being compiled that they forget that most devs don't really care about these things on their own. It's only when they are combined with good documentation and tooling (see Rust, for example) that they become useful.


Hey, that's unfair :(. Nim's dev team has done user surveys for three years now, https://nim-lang.org/blog/2020/02/18/community-survey-result... , and improved the docs significantly. Rust has excellent docs, but also a larger dev team.


I'd guess a fair amount of that is just because it's a much smaller team of folks, with fewer resources.


I liked the documentation when I was looking at it before. What do you mean by "shambles"?


When I started learning Nim I noticed the docs and even the getting started tutorial just threw the kitchen sink at you, it totally overwhelmed me.

So here's a simple project I wrote that's easier to understand. I work on this every couple of weeks on Saturdays.

https://github.com/sergiotapia/torrentinim

For example, parsing HTML: https://github.com/sergiotapia/torrentinim/blob/master/src/c...

I wish the Nim team worked on an easier onboarding process for the language. The documentation is in-depth, but it throws everything it has at you all at once, with no progression.


I use exercism.io to pick up syntax; I'm finding it helpful for Nim.


> When I started learning Nim I noticed the docs and even the getting started tutorial just threw the kitchen sink at you, it totally overwhelmed me.

This sums up my experience as well. But nowadays there are more resources for learning Nim. The Nim in Action book is a few years old but still holds up. Nim Basics is a free tutorial for beginners: https://narimiran.github.io/nim-basics/ When did you start learning Nim?


Yes, if you search on the web or YouTube there are surprisingly few tutorials for Nim. Compare that to Julia, which has a couple of published books and also has courses available on Udemy, LinkedIn Learning and YouTube. Some of those Julia courses are for pre-1.0 Julia, while Nim doesn't have a single course post-1.0, which is a shame. Let's hope that changes soon - more learning resources can only help interest in Nim grow.


Nim has a book. The one Dom made. Nim in Action or whatever.


I debated whether to start learning Rust vs Nim... Rust does seem to have more libraries to get stuff done.


I find that with JavaScript, Rust, and Python, my tool belt is pretty well equipped these days.


Also consider Julia.


I would say that you need to present a case where Julia would be needed. I do Python and JavaScript; they cover most stuff that isn't low level. Rust covers the low-level part, so what is the need for Julia?


Numerics and ML.


Python handles that pretty well. Maybe Julia does it marginally better, but I don't think it's worth learning a whole new language for this unless your day job is only numerics and ML.


This is only true because other people wrote the hot parts in other languages and exposed a Python library. That's fine, but it isn't really fair in a battle of languages.


I think it's fair. We're not talking about intrinsic language qualities; we're talking about what you can do. Or that's my take, anyway. That's what I think we're comparing when the lead line for the whole conversation is "tool belt".


The two are related. If I can't write fast code in the language itself but have to resort to another language, then that's very much a "what can you do with it" problem.


And I'm saying doing stuff with numpy counts as doing stuff in Python. Just because numpy is mostly implemented in C doesn't prevent me from writing ML in Python.


I understood. What if you need a feature that is not implemented in C and exposed? You need to write C.


I agree with you there. I just haven't hit that need yet. I suspect most people don't.


I suspect many people do but just don't (yet) realise that there's an alternative.


Same issue with Julia, no?


Julia is much more amenable to implementing it in native Julia, though, so I think it's a lot easier to stay out of C even when there's a similar library hole in Julia as in Python.

However, it's just a wall I haven't really bumped into. I suspect many people don't, and a part of that is popularity: lots of exposure does a lot to reveal the warts. Though I think that goes for both Julia and Python.

But I think once you know numpy well you probably won't run into that restriction (you can extend it in native Python) unless you're an expert (for the ML use case).


Good C interoperability is a benefit that Julia and Python share and exploit for performance gains. They're on even footing here.

Rust does it too, calling external unsafe code so it doesn't need to rewrite the world all in one go. I'd say that the extendability of a language isn't just fair to count, but a crucial aspect in comparing languages, because that's what people do in the real world.


No, interop is so much simpler in Julia. Not to mention C++ interop.


Yeah, I won't argue there; I was responding to the notion that Python is somehow cheating by using numpy. Python has at least 3 paths to C++ interoperability now, but they've all got their warts.


And they all require you to write wrappers. Peep the Julia interop, https://github.com/JuliaInterop/Cxx.jl: you just import your C++ code and it works.


That has been broken since v1.4 at least. You might want to peep CxxWrap.


They did that because Python has an appealing user experience for people using that kind of code day-to-day. IMO this is what Julia missed; I've tried it at least once per year since first hearing about it in 2014 and the UX was horrible every time. Pretty sure if I ever found myself needing to write low-level numerical code I'd just use Fortran like everyone else.

It's a real shame because Julia initially felt like it might develop to be the natural successor to Python in scientific computing, but over time that has seemed less and less likely.


When was the last time you tried and what problems did you encounter?


Last year sometime - I'm about due another look.

The main problems were cryptic errors in the REPL, super slow start-up and first-run times (a JIT issue I seem to recall reading is being improved), and a generally very poor third-party package management system.

On the last issue, I'd say Python is not perfect but pip and various virtualenv management systems are "good enough". Having something like the Rust/Cargo setup is what I would prefer.


It has gotten a lot better I think.


Looking at it that way, every language is just a wrapper around assembler.


From this perspective, I totally agree. If you enjoy learning new languages (at a surface level understanding) for the increase in perspective, you might find Julia worth learning.


It doesn't do it just 'marginally better', but you don't seem to have a need for it just now.


How much julia have you written?


It's not marginal, not even close.


I disagree, and judging by relative popularity, so do most people. I guess the onus is on you to prove that it's not even close.


Popularity and quality are not really correlated, so the first point is false. As for the second, nobody has to prove anything to you - if you're interested go take a look and you will see for yourself, otherwise don't and keep using Python.


> Popularity and quality are not really correlated, so the first point is false

Popularity is a good enough proxy for how good a given tool is at a given task. Give 100 people a hammer and a toothbrush, ask them to nail a plank, and you can divine the relative usefulness of each tool from the number of people who used each.

> As for the second, nobody has to prove anything to you

Nobody is beholden to anyone, but if you're putting a contrary position forward with no evidence be prepared to be asked to elaborate. Such is life.


The comparison ignores things like age (older tools will be more popular). Certain things have to be experienced; I'm not sure how you want me to prove something like productivity. How would you prove Python is better for some things than C?


Your analogy is false. Python is popular because it is popular - it has momentum. That's pretty much the only reason.


The reason for Python's popularity is only that it's popular? Not that it's suited to any particular task?

Then no doubt the majority of those 100 people would use the toothbrush to hammer the nail. It is after all a far more popular tool.


Again, that comparison makes no sense. Python became somewhat popular for a reason, and that built momentum, but it didn't get to its current popularity by being a superior language for any particular task - rather, because many people use it, write tutorials, answer questions on SO, and write libraries and tools.


Python had a number of similar competitors a few years back, the main one being Perl. Python is better than Perl for certain tasks; its syntax encourages writing more readable code. Personally I like Perl, but if I am doing anything non-trivial, Python will give me more maintainable code. Perl has a more expressive syntax, which is both a blessing and a curse.

Then you have PHP, which is (or at least was) an inferior language to Python when Python started getting popular, and Python does more than just web dev. Likewise with Ruby.

Python could easily be described as "better", depending on the task.


I don't care about proving anything. You are looking at absolute numbers as opposed to things like growth, or the experience of people who switched.

Even something like inverting a random 4x4 matrix is just so much simpler. In Julia it's just "inv(rand(4,4))". In Python, you have to install numpy first, and then writing it out takes about four times as many characters.


In Python it's also literally "inv(rand(4,4))".

And yeah, you need to install numpy first. But if you're doing this kind of work, you of course already have it installed.


No, it's

  import numpy as np
  np.linalg.inv(np.random.rand(4,4))

And you have to write it like this, because of lack of dynamic dispatch.


> import numpy as np

> np.linalg.inv(np.random.rand(4,4))

Which of course has nothing to do with Python, and is trivially reduced to the aforementioned `inv(rand(4, 4))` with imports.

> And you have to write it like this, because of lack of dynamic dispatch.

Not at all.


But you shouldn't be doing those imports, and nobody does it like that. You shouldn't import into the global namespace, because you can't dispatch on the type, so you need to refer to things by their full name.


Well, not really, no. You shouldn't import into the global namespace for clarity - "from numpy import *" is very unclear. But importing specific objects is fine and is done all the time. And if you're in a notebook setting and value "less typing" over everything else it's certainly encouraged.

Dynamic dispatch would help if you also defined a "rand" function in scope, but that's a much more general argument, and IMO you'd lose more than you'd gain with it.
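Concretely, the "with imports" version being argued about is just (assuming numpy is installed):

  from numpy.linalg import inv
  from numpy.random import rand

  inv(rand(4, 4))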


OK, but the imports don't save much typing.


If Nim had sklearn and pandas, I would probably choose it over Julia. I would love a statically typed Python. I have no interest whatsoever in learning a brand new ecosystem and syntax at this point.


Julia has crazy interop, so you can call Python libraries if you want. You are missing out, though.


I had the exact same debate. One potential strength for my use case (cross platform mobile stuff) is that Nim actually transpiles to Objective C, which can then be compiled with Xcode. That’s a big deal because you need to provide bitcode with your binaries on AppleTV and Watch. Rust can do it in a hacky way but there’s no guarantee it’ll keep working.

I still went with Rust in the end, though. Nim’s documentation was rough going and I hit a lot of walls. The initial learning curve for Rust was way higher (borrow checker and all that) but once I got up to speed it was a great experience.


AFAIK with Nim you have access to C and C++ libraries.


?

With both Rust and Python you have "access" to the C FFI. (Not C++ out of the box, but there are some projects out there that make it work.)


Nim compiles to C, C++, and other languages, so it's not using an FFI; you're using C and C++ libraries natively.


NimConf 2020 [1] also took place just recently, and was a great success.

[1]: https://conf.nim-lang.org/

Hacker News coverage: https://news.ycombinator.com/item?id=23585006


RosettaCode comparison between Rust and Nim: https://rosetta.alhur.es/compare/Rust/Nim/#


Another statically typed language that "looks like Python": http://strlen.com/lobster/


If you like Nim but you are looking for something that can be practical for corporate use cases where adopting a new language with a young ecosystem is extremely hard, consider Cython.

Cython is a superset of Python that makes it super easy to compile custom Python extension modules directly to C or C++; it interoperates easily with Python packaging and is very battle-tested.

It allows you to "just write for loops in native Python" and get the statically typed, compiled optimizations you'd get from C or C++ for free. It also supports fused types for multiple dispatch of compiled functions.
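A minimal sketch of what that looks like (a .pyx file built with cythonize; the names are illustrative):

  # fastsum.pyx -- build with: cythonize -i fastsum.pyx
  def csum(double[:] xs):
      cdef double total = 0.0
      cdef Py_ssize_t i
      for i in range(xs.shape[0]):  # this loop compiles down to plain C
          total += xs[i]
      return total

Calling it from Python is unchanged: from fastsum import csum.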

I’ve used it in production workflows at many companies. It’s very easy to use.

The best part is that you can apply it gradually, only bothering to implement small sections of code as Cython extension modules when there's real proof from a profiler that the code is a bottleneck for reasons addressable with static typing. You don't have to live with the huge premature optimization baked in as a premise of languages that enforce this for all code. "Static typing when _you_ need it."

https://cython.org/


Cython is excellent. The Numba [1] and NumPy combination is also strong.

> Numba translates Python functions to optimized machine code at runtime using the industry-standard LLVM compiler library.

[1] http://numba.pydata.org/


Great points. The bits of cython I've used are solid.


A very informative interview with Araq (the creator of Nim) was posted a few months ago: https://www.youtube.com/watch?v=-9SGIB946lw It's mostly up-to-date.


Another language, which is a possible alternative to Nim, is Slider, available at http://mrsename.blogspot.com/


What concurrency support does Nim have?

Anything like coroutines, async-await etc? Or just OS threads?


All of the above, though the coroutine library looks out of date. Most network libraries use async. Threads are possible, and Weave looks promising: https://github.com/mratsim/weave
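For async specifically, a minimal sketch with the stdlib asyncdispatch:

  import asyncdispatch

  proc greet(ms: int) {.async.} =
    await sleepAsync(ms)
    echo "hello after ", ms, " ms"

  waitFor greet(100)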


I am just surprised this hasn't been on HN before.


Nim has been mentioned on here many times.



Then why is this on the front page?


It often gets posted and ends up on the home page because many developers are very impressed by Nim and love it.


This is arguably a dupe, since Nim has been discussed on here in the last 12 months.


we ran out of new things


To be fair, I discovered it on exercism.io and am thoroughly enjoying learning it.


What are your thoughts on exercism, if you don’t mind me asking? I personally liked it when I dove into Go and Rust and was searching for practice material.


Absolutely love it for learning syntax. Work through the logic once in a language of choice, then apply it to several other languages to learn their syntax. If there's time, go through code review to get antipatterns worked out. It's also fun to mentor, I would reckon.


It has.


Nim is just downright ugly and borrows the worst from Pascal and Python. The multi-paradigm concept is a mess. I tried it a couple of years ago and moved on to better, simpler tools.


Lol. The diversity of worldviews never ceases to amaze me.

What might those "pretty" "simpler tools" be? Perl? Heh.


Respectfully, I disagree.



