Hacker News

An ML that competes with rust+tokio is an exact description of haskell. It's had the best multicore runtime and concurrency primitives of any general purpose language for quite some time. No need to wait.


My impression, though, is that Haskell's laziness and focus on purity make it rather a pain to work with, whereas OCaml is a lot more pragmatic, like Rust.


That's why OCaml won over Haskell and Standard ML for me. The authors are not afraid to “get their hands dirty” with procedural constructs or even OOP for the cases where those things are useful.


This impression does not match reality. It's great to work with. I'd certainly take it over OCaml (having spent probably thousands of hours on both).


Can you give a few examples of why you picked Haskell over OCaml, please? And let's skip "elegance" and other pie-in-the-sky magic.

Let's talk productivity in commercial projects. Or making scripts for your own use (if you're not happy with bash/zsh/fish... which I'm not).


> An ML that competes with rust+tokio is an exact description of haskell.

Haskell only exceeds rust+tokio in terms of overall code elegance and succinctness.

When it comes to performance, rust+tokio is going to be much more performant by default as there will be no GC, no hidden traps with laziness, no subtle memory leaks etc.

Sure, a Haskell wizard will be able to reduce the gap further and solve issues, but it is going to be difficult to consistently beat rust+tokio. By the time said Haskell wizard makes changes to the code, the famed elegance of Haskell is going to suffer, as the code will be littered with strange incantations of strictness annotations, possibly some C FFI and other magic. This will not be the beautiful Haskell that you learn in textbooks. Very few people know how to make Haskell truly fly. If you're one of them, then you're lucky!
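For the record, the "strictness annotations" in question are often as mundane as swapping a lazy fold for a strict one. A minimal, hypothetical sketch (not from any real codebase) of the kind of laziness trap meant here:

```haskell
import Data.List (foldl')

-- Lazy foldl builds a million-deep chain of unevaluated thunks
-- before anything is forced, which can blow the heap; the strict
-- foldl' forces the accumulator at every step instead.
lazySum, strictSum :: [Int] -> Int
lazySum   = foldl  (+) 0
strictSum = foldl' (+) 0

main :: IO ()
main = print (strictSum [1 .. 1000000])
```

Same algorithm, same types; the only difference is when evaluation happens, which is exactly the kind of thing that doesn't show up in textbook Haskell.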


> When it comes to performance, rust+tokio is going to be much more performant

This is nonsense without significant further qualification.

I guarantee the tokio scheduler and async stack model gives you net worse results for many real workloads.


While quoting me you omitted the phrase “by default”. I said “When it comes to performance, rust+tokio is going to be much more performant by default”.

What I meant was that in the typical case, you can expect rust+tokio to be faster than Haskell. This is not a surprise, as Rust is much lower level, does not have a GC, and its compiled output maps better to modern processors than the pure functional lazy style of Haskell.

Plenty of benchmarks across a variety of workloads and programs confirm that Haskell is slower than Rust on average. For webserver-type use cases (which use a lot of async), Rust is faster. Check out the TechEmpower benchmarks, for example.


Including work-stealing scheduler and runtime? Ability to just spawn 50_000 tasks -- regardless of whether they're CPU-intensive or I/O bound -- and have the runtime handle it speedily?

I've heard Haskell praise a good amount of times but at the same time many people also said that doing actual work with it has more friction than it should, so I don't know.


I don't know if those specific things are in Haskell, but what I do know is that Haskell concurrency is pretty much best in class. You get channels (Control.Concurrent.Chan is in the stdlib), STM, and green threads (forkIO). You can also get stuff like async (from the async library) or streamly (even higher level than traditional async). I'm not 100% sure you can push insane speeds with it like you can with Rust (even though Haskell is really fast, it's just very tricky to optimize correctly), but the developer experience on concurrency is just through the roof, imho (dare I say even better than Rust). STM is just the best kind of magic.
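For the flavor of it, here's a tiny sketch (assuming GHC with the stm boot library) of forkIO and STM working together:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.STM
import Control.Monad (replicateM_)

-- Ten green threads each atomically bump a shared TVar; the main
-- thread's transaction retries (i.e. blocks) until all ten bumps
-- have landed, with no locks or condition variables in sight.
main :: IO ()
main = do
  counter <- newTVarIO (0 :: Int)
  replicateM_ 10 (forkIO (atomically (modifyTVar' counter (+ 1))))
  atomically (readTVar counter >>= check . (== 10))
  print =<< readTVarIO counter
```

The `check`-and-retry pattern is the magic part: composable blocking without ever naming a lock.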


Yes. Go spawn a million tasks in Haskell. It will work fine. 50k is child's play for the GHC runtime. The only serious competitor is BEAM.
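A sketch of what that looks like (50,000 here so it finishes quickly even interpreted; the same code works with a million):

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)
import Control.Monad (replicateM_)

-- Spawn 50,000 green threads and join them all through a single
-- MVar. GHC's runtime multiplexes these lightweight threads over
-- a handful of OS threads, so this completes almost instantly.
main :: IO ()
main = do
  done <- newEmptyMVar
  let n = 50000
  replicateM_ n (forkIO (putMVar done ()))
  replicateM_ n (takeMVar done)
  putStrLn ("joined " ++ show n ++ " threads")
```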


Yep, that's what I was getting at: Erlang/Elixir do this effortlessly. I'd love for the popular system languages to have the BEAM runtime.

Well, based on your comment, I might reevaluate Haskell. Last time I was severely put off by lack of good tooling (but I did hear cabal was improving) and a fragmented ecosystem. Maybe things have changed.


Both cabal and stack have improved, there's now a pretty good language server in HLS, and there's ghcup to download, install, and manage both GHC and the tooling (on Linux as well as Windows). And for quick one-off experiments I usually use nix (I don't fully use it, too complex for my brain, but being able to do

    nix-shell -p 'haskell.packages.ghc<version>.ghcWithPackages (pkgs: with pkgs; [any number of packages here])'
is just an insane superpower that lets me experiment with stuff in ways most other ecosystems can only dream of)


You're probably aware of this, but not many share the love for Nix. I tried it and got put off by the unnecessarily alien syntax. They honestly didn't need to invent their own language; there are plenty out there that are pretty close to what they are aiming at.

But it's not only that. It's a general problem of a high initial learning curve.

Modern tool inventors really have to finally learn that everyone is super busy these days. Make it brain-dead quick to learn or your tool will forever remain a niche curiosity for hobbyists.


I totally get that sentiment about nix, and actually I sort of share a variation of it too: I really want to learn it, and I like the featureset in theory, but I can't get used to it in practice, and I don't have the time to invest myself into something that difficult to get going.

The reason I brought up the nix command is that I only use nix, for Haskell development, for that specific command: I found it once in a blog post, saved it, put it in a function in my bashrc, and I use nix quite literally for only that purpose. I've done a lot of development in various functional languages (with a dayjob in F# that lasted 3 years), and being able to quickly experiment with libraries was something I sorely missed when doing REPL experimentation in those languages (I think F# recently got a #nuget directive, but that was after I stopped using it).


Curious and interesting.

Not to be the party pooper: didn't Docker work well for your Haskell use-case?


I personally haven't had much chance to use Docker with Haskell specifically, but I imagine it would definitely work; it would just take a bit more effort to get going, so I take the option with less friction (at least for my specific use cases). Nix is also somewhat popular in the Haskell community, especially when it comes to GHCJS (with tools like obelisk making it easier to develop cross-platform apps using web tech). So I just go along with the flow of the community, personally, especially considering that I use Haskell exclusively for my own personal stuff (sadly).


Yes, as they say back home, "when you are going I am coming back".

"Parallel and Concurrent Programming in Haskell" (2013)

https://www.oreilly.com/library/view/parallel-and-concurrent...


I've heard that many times, still skeptical after trying Haskell thrice.

Have Haskell's ergonomics improved?

I feel your snark is unjustified. You might be putting people in two extremes: wise elders and hip kids. There's a huge amount of people in-between however.

(Also, the saying actually is: "You're going to where I am coming back from".)

Find me something like Erlang/Elixir with the speed of Rust and I won't learn another programming language ever again.

No? Then the search continues.

So far Haskell hasn't impressed. In all honesty, Rust is quite hard to put in that niche as well, since its `async` stuff is extremely annoying and hard to get right, but I guess the attempts are still ongoing. OCaml is progressing, but who knows when they'll get there.


You might find an answer at Facebook infrastructure....

https://engineering.fb.com/2015/06/26/security/fighting-spam...


I'll go through that, thanks. But so far it still seems that Haskell is an acquired taste and that's a shame. You have to overcome a number of idiosyncrasies to get productive.

As a guy who went through at least 8 languages and 30+ frameworks, it gets tiring. I want something that ticks most boxes from the get go.


> Find me something like Erlang/Elixir with the speed of Rust and I won't learn another programming language ever again.

You're describing haskell again.

You haven't articulated why you have disliked haskell in the past.


Mostly the seemingly big initial learning curve; you have to get extensively onboarded in "the Haskell way" (monads are not hard to get but the community is hell-bent on avoiding wording that makes it easier to grok; why?).

That could be okay for many but as I get older, I tend to take people/organizations that require big upfront investment less seriously.

Example: one of the things Elixir won me over with was its bite-sized introductions and practices. You can be a 3/10 Elixir dev and you can be an 8/10 one, and that mostly depends on how many of the official tutorials you've covered. The road is mostly a straight line to an acceptable level of proficiency, at the end of which you can start choosing to specialize.

Rust, OCaml, Haskell -- they all failed that test for me.

I picked Rust mostly because of the no-GC situation and because of `cargo`. Many older programmers handwave away the importance of good tooling and this is where they lose a lot of potential mind share that can rejuvenate their languages / ecosystems.

Example on this: OCaml's tooling. A lot of people in this ecosystem downplay the importance of a good task runner + builder + project manager. I spent half a weekend learning `esy` once and mostly tamed it by making it imitate mix/cargo, but it wasn't trivial. The end result is a build script that does 80% of what mix/cargo do. The exercise made me scratch my head, wondering why what I did back then isn't upstreamed and made official, and why everyone is happy to pretend that building an OCaml project is a solved problem when it (very!) clearly isn't.

Haskell's cabal didn't fare better last I tried it -- admittedly that was more than a year ago.

If Haskell has good bite-sized lessons that lead to an actual real job productivity (less academic exercises, please!) then I'd be happy to give it a fair try and maybe make it a part of my toolbox.


I use Elixir at my company, and I've gone through three phases: first, loving it for the polished ecosystem and functional programming; then getting frustrated because, even though it is functional, it is not at all like writing ML-style code and there are no types; and now starting to grok BEAM and OTP.

Specifically, I realized how many problems we solved using OTP that would be much more challenging to get right otherwise. We can use processes to get transactional behavior and spawn workers very simply.

We use event sourcing.

Our state snapshots are just a process that receives events. It was easy to evict snapshots by killing processes idle for too long. Beautiful!


Yep, all true, I love it myself. But nowadays it gets harder for me to love it due to the lack of strong static typing. :(

Test coverage becomes mostly wishful thinking. And it's extremely easy to do non-exhaustive pattern-matching which is something that just kills me.

It's absolutely true that Elixir is mega-productive though. And for a ton of projects out there it's good enough and more.


You've done your fair share of trying!

Your assessment reflects mine entirely. Elixir has that charm in that it leads you to productivity quickly, just as Go does. Someone could argue that's only because of a much larger pool of users that's paved the path before.

Picking up Haskell again for fun, and the Effective Haskell book has been a fun learning experience. Not too beginner-ish, and doesn't take too much time to explain concepts.

I'll have to try esy and dune again sometime when OCaml 5's stable. Their commands are just different enough from go/cargo/npm to be annoying.



