I'm so tired of non-substance Rust content on HN. I don't want to know THAT it is written in Rust (shocking, I know!), I want to know WHAT made it better to write in Rust rather than X. I'd read the shit out of an article about that. It's starting to become a meme cheat code: just add "in Rust!" to the end of your post title and you'll get a baseline starter pack of HN votes.
The constant Rust spam on the front page makes me think that there may be some sort of pro-rust bot that upvotes them. There probably isn't, just a lot of enthusiasm, but it's tiring.
However, it has also convinced me that Rust isn't faster than C++ in practical applications, thanks to the number of "we moved our C++ codebase to Rust" blog posts that have 0 benchmarks. You can practically guarantee that if their code had gotten faster, they would have said something about it. That has dulled my enthusiasm for the language significantly.
Also, it is humorous to me to see blog posts that say, "[Rust feature that causes inconvenience] enables compilers to do [compiler optimization that C++ compilers already do]."
> The constant Rust spam on the front page makes me think that there may be some sort of pro-rust bot that upvotes them. There probably isn't, just a lot of enthusiasm, but it's tiring.
I get it; we all have our annoyances. Personally, the above kind of comment is tiring. Complaints about what makes the HN front page seem like a waste of energy. How can we contribute to moving the conversation forward? By "forward" I mean learning from each other, sharing ideas, and other constructive goals.
P.S. If you don't want to see [keyword] in HN, there are probably browser extensions that can help you.
The problem is that I am interested in Rust's development and I do want to see the Rust keyword, when the content is substantive. For example, there is another front page post right now about Rust and Haskell, which is an interesting commentary.
>>> pclmulqdq: The constant Rust spam on the front page makes me think that there may be some sort of pro-rust bot that upvotes them. There probably isn't, just a lot of enthusiasm, but it's tiring.
>> xpe: ... Personally, the above kind of comment is tiring. Complaints about what makes the HN front page seem like a waste of energy. ...
> 0xdeadbeefbabe: The complaint is about the input not the output.
As far as I can tell, the input is the low-effort "I like Rust but have no insight" blogs. The output is their appearance on the front page of this website.
>Also, it is humorous to me to see blog posts that say, "[Rust feature that causes inconvenience] enables compilers to do [compiler optimization that C++ compilers already do]."
What? You can't safely exploit strict aliasing in C++, and even when Rust emitted the equivalent hints, the optimization was broken to the point where the Rust compiler had to disable it because of LLVM bugs.
The restrict keyword historically has meant "I promise these pointers don't alias; please optimize as though they don't." It has nothing to do with enforcing correctness, and checking that promise in the general case is quite tricky.
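A rough sketch of how Rust derives this guarantee from the type system rather than from a programmer promise (hypothetical function, not from any real codebase):

```rust
// In Rust, a `&mut` reference is guaranteed unique, so the compiler can
// pass `noalias` to LLVM automatically -- roughly the optimization C gets
// only when the programmer hand-writes `restrict` and promises the
// pointers don't overlap.
pub fn add_assign(dst: &mut [f32], src: &[f32]) {
    // Borrow checking rejects any call site where `dst` and `src`
    // overlap, so this loop can be optimized without runtime overlap
    // checks or programmer promises.
    for (d, s) in dst.iter_mut().zip(src.iter()) {
        *d += *s;
    }
}

fn main() {
    let mut a = [1.0, 2.0, 3.0];
    let b = [10.0, 20.0, 30.0];
    add_assign(&mut a, &b);
    assert_eq!(a, [11.0, 22.0, 33.0]);
}
```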
The constant complaining about Rust posts may suggest the opposite is true. Why don't you follow other topics if you are not interested in Rust? You are free to.
Why don't you put your interest and desires together and make something like that? I'm sure the people who are at least trying to do something would welcome the example.
The funny thing is that posts like these are fine and upvoted heavily. But if someone makes a post like "TIOBE declares C++ the language of the year 2022", it's either flagged or utterly downvoted to death.
The Rust fanatics have VERY thin skin if the language they hate gets any upward-adoption news.
I wouldn’t care if Rust were 10x slower. Just within my career, computers have become 1000x faster. We can definitely afford a stack that might actually work.
I wrote a small service to sync the datetime based on a GPS/IMU module with an Ethernet interface. It needed to be rock solid because it was going to fly in the field (scientific instrument, not avionics). It was faster and easier to write in Rust, despite there being no one else with Rust know-how in the company, the rest of the system components being in C++/Python, and my having only several months of C and C++ experience myself. I barely knew Rust and yet felt way more confident the program didn't have weird unexpected edge cases, leaks, etc. Not to mention it was way more ergonomic.
My impression as a non-expert who has only dabbled in Rust is that while it is ‘easy’ by design to write memory-safe code, it is relatively difficult to write panic-free code, and by and large the ecosystem doesn't try. (I know about `no_panic`.) In some domains (e.g. the rust-analyzer panic discussed here yesterday¹) this is fine; in others “this is fine”.²
The first Ariane 5 launch is an example of the latter. The code was written in a ‘safe’ language, Ada. A conversion overflow (that, as it happened, was in fact genuinely harmless) triggered a panic that (by design) brought down the subsystem, and with it the rocket.³
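For illustration, the same hazard class in Rust terms (made-up numbers and function name; `try_from` surfaces the overflow as a value instead of a panic):

```rust
use std::convert::TryFrom;

// Sketch of the Ariane-style hazard: narrowing a wide value into a
// 16-bit field. Handling the overflow as a value, instead of letting
// a panic (or Ada's runtime check) take down the whole task.
fn to_telemetry_word(velocity: i64) -> i16 {
    match i16::try_from(velocity) {
        Ok(v) => v,
        // Out of range: saturate rather than abort the subsystem.
        Err(_) => i16::MAX,
    }
}

fn main() {
    assert_eq!(to_telemetry_word(1234), 1234);
    // With `.unwrap()` instead of the match, this input would panic.
    assert_eq!(to_telemetry_word(40_000), i16::MAX);
}
```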
I allow for panics when the panic is expected to never execute, i.e. logic errors.
I see how it can be problematic, and thought of adding a general logic error for such cases. However, it adds work for devs and the machine.
Still, panics can happen for allocations, where I don't know a solution.
Would you consider this as a problem as well?
If not, no_panic should suffice, right? Yeah, it might make things harder, but for rockets you probably won't use that many external crates and a bit more work should be acceptable, shouldn't it?
What would you like to see the devs and the lang team to do?
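On the allocation point above: stable Rust does at least offer `Vec::try_reserve`, which turns explicit collection growth into a fallible operation instead of a panic or abort. A sketch (hypothetical helper function):

```rust
use std::collections::TryReserveError;

// Fallible allocation on stable Rust: `try_reserve` returns an error
// instead of panicking/aborting when the allocator refuses.
fn push_checked(v: &mut Vec<u64>, x: u64) -> Result<(), TryReserveError> {
    v.try_reserve(1)?; // may fail, but never panics
    v.push(x);         // capacity already reserved, so no allocation here
    Ok(())
}

fn main() {
    let mut v = Vec::new();
    push_checked(&mut v, 42).expect("tiny allocation should succeed");
    assert_eq!(v, vec![42]);
}
```

This only covers growth you request explicitly, of course; implicit allocations elsewhere in a program can still panic or abort.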
I'm not knowledgeable enough about Rust's actual current state and plans to make concrete suggestions. Given the wide interest in Rust for security, I do think it's likely that in the long run significant effort will go into avoiding DOS-by-panic.
I think it's unfortunate that LLVM had no popular small targets during Rust's early years, so it didn't get much attention from embedded developers. Heap allocation is one place where this shows. It's not really worse than C++, though: in principle many parts of the C++ standard library can use a custom allocator, but handling allocation errors is awkward, and in practice the ecosystem doesn't try. The C and C++ standards also don't generally specify which library components are allowed to perform heap allocations internally.
C and C++ have ended up with small parallel ecosystems for such use cases. Since Rust aims to have one true ecosystem under cargo, it might be nice if library components could be explicitly tagged as panic-free, heap-free, etc.
Rust doesn't exactly have things tagged as heap-free, but the no_std category[1] contains many crates whose authors have at least indicated that they don't use the standard library. Some may pull in liballoc and still want to allocate, but it's better than nothing. Additionally, many packages that normally require the standard library let you opt out of it, with reduced functionality.
My team works in no-heap Rust. We use external packages. It's much easier than in any other ecosystem to do so. Even that bit above goes a very long way and saves a lot of work.
Comments like this make me think that you have never worked in a domain where safety is paramount (like avionics or medical technology).
People who are developing in these domains are typically using a strict subset of C or C++ and running their programs through a process of formal verification.
As far as I am aware, Rust has no specification to formally verify against, and the formal verification tools are all out of date (because of the rapid pace of language changes) or lacking (for example, Miri can't consider all possible input values, because it executes your code the same way it would normally run).
The safety-critical C and C++ world has a wealth of formal verification tools. Those tools, not the compilers, are what get validated during tool qualification. Ultimately it is about the formal verification, static analysis, and dynamic analysis tools, as well as traceable code execution, robustness tests, etc.
A formally verified Rust compiler could really be a game changer, but until then, the tools do exist to write safe C and C++. They are expensive, though, and adherence is low outside the places they must be used.
Ferrous Systems is “on the last legs” of qualifying the compiler for a few different safety standards. It is true that Rust is behind in these domains, but it’s only a matter of time.
Most people aren't. I understand the hype on one level. It's fun to do great work and build high quality things. OTOH, What most of the world needs is "good enough" work rather than great work.
You can just toss out names of qualities, and it might impress someone with no experience, but the rest of us are just waving our hands like "OK... and then?", waiting for the substance, of which this comment provides none.
I'm not sure I agree. If anything someone with experience has first-hand knowledge of the perils of buffer overflows, double frees, and other memory bugs. Sure we can go into more detail but it's a little disingenuous to act as if someone with experience writing C would not be able to fill in the blanks.
>someone with experience writing C would not be able to fill in the blanks.
"yeah well if you're smart you wouldn't need explanation"
except the people you're swaying with such remarks aren't the ones who already know, its the ones who are afraid to ask lest they be judged so they just kinda laugh along and parrot the smartest-bully.
I have never worked with Rust, but seeing major operating systems and critical software gradually adopting it, and other developers reporting fewer segfaults after replacing their codebases with Rust, is convincing enough for me to try it out.
I still think this is one area where Rust missed the mark. Go concurrency is quite amazing. It's fast and easy to reason about. Every language can learn something from how goroutines work here. Most importantly, it feels like you're just interacting with synchronous code. There is no "what color is your function?"[0] problem.
The fundamental primitives Rust decided to adopt are just not ergonomic by comparison, and they suffer from the aforementioned "What Color Is Your Function?" problem.
An otherwise great language decided to adopt an old hand grenade for its async primitives.
This is an absolutely reasonable take. The thing is, this is just a divergence in values. Rust isn’t willing to accept the overheads of that style of concurrency, and is willing to trade off on the “harder to learn but with less overhead” side of this spectrum.
There's an interesting dynamic around asynchronous code in that the fastest way to do it is to have stackless coroutines which operate on state objects in a batched fashion. This limits what you can do, but completely eliminates a lot of the overheads associated with async.
The inclusion of generic async semantics at the language layer is itself a compromise. Rust may have gone a little too far down the "reduce overheads" rabbit hole given that this is the case. Go definitely didn't go far enough for high-performance use cases.
I may not have read the article correctly, but I see no actual evidence there that it is the "world's fastest growing language", whatever that might mean, except that there are a lot of questions about it on SO (and of course posts like this on HN).
Also, possibly related: isn't it strange that you see almost no Haskell posts here anymore these days?
The evidence is the same as in elections: if only 30% of the people vote, the most popular choice was not voting.
I don't see PHP programmers going to Stack Overflow to vote for PHP, but PHP still gets a lot more things done than Rust, for sure.
And not because they don't like the language, but because "normies" don't go around talking about their love for their hammer or their drill or their kitchen either... no wait, chefs talk about their knives constantly!
Sure, but that doesn't mean there isn't any evidence, as you claimed. A consistent result, year-over-year, in a large and long-running survey is definitely evidence. Not proof, but still evidence.
HN goes through phases: OCaml, Haskell, Go, React. I've been here almost two decades now, and it's always been this way, but you used to be able to disagree and talk about it without being shunned by devotees. Right now you're not even saying much that's negative, but you're almost so grey you're effectively silenced.
Nim, while providing the same memory safety as Rust, doesn't do so without overhead. So Nim very expressly sacrifices performance for safety.
In rust, for example, it's possible to safely share data between threads without incurring costs of a garbage collector. Nim uses a different GC for each thread, so it naturally increases costs on a per-thread basis.
I was sort of vaguely aware of this but thank you for pointing it out! I think the point still stands, and that reference counting like this is just a different kind of garbage collection.
It certainly adds (some, not many thanks to static analysis) runtime checks, but it's also fairly distinguished from typical tracing garbage collection by being deterministic. I believe ARC/ORC also works with a shared heap via "isolated" data.
It's been a few years since I've used Nim, but outside of the language itself, the tooling around Rust is absolutely incredible. Rust Analyzer, Clippy, Cargo, etc.
I can install Rust, for one thing (I’m on an M1 mac; trying the three main nim installers, one gives me an out of date version, one gives me the x86 version, and one segfaults)
Also Rust’s error messages and standard library documentation are way more helpful
Also Rust runs nearly twice as fast
(I do like the nim language, but no more or less than I like the rust language; tooling makes a big difference though)
For me (and inv2004), Nim runs faster than Rust on your RosettaBoy project. But perf usually depends on an awful lot of territory that is routinely unexplored before conclusions are bandied about.
The Nim installation & tooling experience unfortunately can be rough. OTOH, it has something like <1% of the funding/exposure. Also, I find the hundreds of package dependencies for any non-trivial Rust program to be pretty awful, as well as reading other people's Rust code (a point made more salient by all the deps).
Anyway, there are many dimensions/axes by which to evaluate most things in life and usually no objective way to weight/combine them into a single overall score. So, it's often good to keep an open mind. (Not saying you don't do so personally!)
I think Rust is probably great for certain types of projects that need efficient programs, but the main problem I have always had is not having enough time, and Rust works against me in that regard.
It is hard to learn and takes quite some time to write compared to many other popular languages. For what benefit? Most languages are fast enough, and the bugs I encounter are not bugs that Rust solves. I do web dev stuff, like I assume most of us do, and the performance bottlenecks are pretty much never due to the language being too slow.
Granted, I understand completely why large companies that work on complex systems have a use case for Rust. The main issue I have with the community is that it tries to sell itself as a solution to everything, when in reality the most likely scenario for most of us is that we would be more productive and successful in basically any other language.
Rust has a nice implementation of discriminated unions/algebraic data types, which makes it easy to write business logic with composition over inheritance. Which languages do that well while also having pattern matching, non-nullability, explicit mutability, etc.? Maybe OCaml? F#? Elm?
F#, Scala, PureScript, Haskell, and many others are much, much better in this regard than Rust. They are not replacing Rust, of course, because they don't focus on performance/low-level work, but in the context of the OP's question, while Rust is great, it still has a lot to catch up on here.
Of course! What I meant was that in some cases composition is a more flexible approach to representing a complex (business) model than inheritance from object-oriented programming.
Rust has a nice (IMO, ergonomic) way of representing composition through a language feature called "algebraic data types", although it's sometimes called "discriminated unions" or "sum types".
Sum types combined with "pattern matching" is a particularly powerful combination that can make code very easy to write and read, IMO. But not always, so don't take it as an absolute truth. This is one of many articles that explain sum types in Haskell: https://medium.com/@willkurt/why-sum-types-matter-in-haskell...
Statically-typed languages that borrow from functional concepts like Rust, Haskell, F#, Elm and others usually support that. C# has been debating adding support for it (https://github.com/dotnet/csharplang/issues/113), but its community hasn't reached consensus yet. Dynamically-typed languages like Elixir don't benefit the same way from this without sacrificing the flexibility of the language, but in this case they're considering alternative approaches like set-theoretic types: https://elixir-lang.org/blog/2022/10/05/my-future-with-elixi....
The same guy posts a lot of information about these topics on his website[1] as well. For anyone who isn't anti-.NET for various reasons, such as it being from Microsoft, I recommend giving F# a try; it is a very nice language.
I am not sure how F#, which started as a .NET version of early OCaml 3, can be seen as a modernized OCaml 5. Even more so when F# had to give up some core OCaml features in the transition (before growing into its own language, of course).
Yeah, OCaml 5 seems like a different beast with the new concurrency system plus the new effects system, etc. If they get a good Windows install experience, I want to try it out at some point, but it seems the current recommendation is to go through WSL or msys/cygwin (I forget which), and at this moment I'm not motivated enough to jump through the hoops.
You have a shape which can be either a circle which then has a radius, or a rectangle which has width/height. The shape cannot be anything else, this is important and something you want to rely on and express in code. In C++ you can't easily do this on the language level. In Rust (albeit in a somewhat limited way) and functional languages, you can, via enums where variants can host other types.
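The shape example above, as a sketch in Rust:

```rust
// A value of this type is exactly one of these variants, nothing else.
enum Shape {
    Circle { radius: f64 },
    Rectangle { width: f64, height: f64 },
}

fn area(s: &Shape) -> f64 {
    // `match` must handle every variant; adding a `Triangle` later
    // turns every match that forgets it into a compile error.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rectangle { width, height } => width * height,
    }
}

fn main() {
    let r = Shape::Rectangle { width: 3.0, height: 4.0 };
    assert_eq!(area(&r), 12.0);
}
```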
You’re not stupid: the OP wrote a bunch of jargon that more or less nobody in the industry uses on any regular basis. It’s a similar pattern to functional programming evangelists’ rhetoric.
The short answer is this: the compiler enforces type-checked unions (called `enum` in Rust) guaranteed to hold one specific type, and requires any `match` block (cf `switch`) checking a value to exhaustively check all possible ones. That’s not the whole story, but it covers about 99.99% of the use cases the OP refers to.
Those language features are nice, but they are not solving important problems, and they do not justify the cost of learning and using Rust for most people.
> which makes it easy to write the business logic with composability over inheritance
One feature Rust is really missing is trait "inheritance" – you can't say "dispatch this trait implementation to this field" or anything like it. (The closest you can get is Deref, but that has its own quirks.)
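A sketch of the Deref workaround mentioned above (hypothetical types; note this is auto-deref, not real delegation -- it covers only one field, and only shared-reference methods without DerefMut):

```rust
use std::ops::Deref;

struct Inner;
impl Inner {
    fn greet(&self) -> &'static str {
        "hello from Inner"
    }
}

// Wrapper "inherits" Inner's methods via auto-deref.
struct Wrapper {
    inner: Inner,
}

impl Deref for Wrapper {
    type Target = Inner;
    fn deref(&self) -> &Inner {
        &self.inner
    }
}

fn main() {
    let w = Wrapper { inner: Inner };
    // Method lookup falls through Wrapper to Inner::greet.
    assert_eq!(w.greet(), "hello from Inner");
}
```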
These days I code primarily in C++, which is what Rust seemingly is designed to replace. It doesn't, as far as I can tell. I optimize directly for the hardware I'm running on, which typically gives me 10-100x performance improvements. Controlling how memory is managed is critical.
It's great when something new and better comes along if that's really the case, but I've seen the "incrementally better in certain use cases" play many times before.
This can be done in Rust as well, using unsafe blocks if absolutely necessary. And only those blocks have to be reviewed for footguns, as opposed to your entire C++ codebase.
Also, even a 10x speedup sounds like a lot. In Rust, using .get_unchecked() (the C++ equivalent) instead of .get() is typically only a negligible improvement of a few percentage points in that particular loop or whatever.
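A sketch of the two access styles (contrived loop; the actual gap depends on whether the optimizer can already elide the checks):

```rust
fn sum_checked(v: &[u64]) -> u64 {
    let mut total = 0;
    for i in 0..v.len() {
        // Bounds-checked access; the optimizer usually elides the check
        // here, since `i < v.len()` is provable.
        total += v[i];
    }
    total
}

fn sum_unchecked(v: &[u64]) -> u64 {
    let mut total = 0;
    for i in 0..v.len() {
        // The C++-style access: no bounds check, and undefined behavior
        // if `i` were ever out of range.
        total += unsafe { *v.get_unchecked(i) };
    }
    total
}

fn main() {
    let v = vec![1, 2, 3, 4];
    assert_eq!(sum_checked(&v), sum_unchecked(&v));
}
```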
> I optimize directly for the hardware I'm running on, which typically gives me 10-100x performance improvements. Controlling how memory is managed is critical.
What makes you think you can't control how memory is managed in Rust? Rust doesn't have "automatic" memory management, it has a compiler that can help ensure you are managing memory correctly, and force you to type "unsafe" when you are doing things it doesn't understand.
> it has a compiler that can help ensure you are managing memory correctly, and force you to type "unsafe" when you are doing things it doesn't understand.
Well it's a bit more subtle than that if we're honest.
Arguably, Rust does make a number of memory layouts (self referential structs, per struct allocators, non thread local addresses, etc) much harder to accomplish than "typing unsafe".
If you self-reference using pointers and guarantee the struct will never move, you don't even need unsafe. If you self-reference using offsets from the struct's base pointer, you need a splash of unsafe but your struct can be freely moved without invalidating its self-referential "pointers".
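A minimal sketch of the offset idea, using a plain index rather than a raw byte offset (fully safe and move-friendly; the names are made up):

```rust
// "Self-referential" struct using an index instead of a pointer. The
// index stays valid when the struct is moved, unlike an absolute address.
struct Parser {
    buf: String,
    cursor: usize, // offset into `buf`, not a pointer into it
}

impl Parser {
    fn new(buf: String) -> Self {
        Parser { buf, cursor: 0 }
    }
    // Reconstruct the "self-reference" on demand from base + offset.
    fn rest(&self) -> &str {
        &self.buf[self.cursor..]
    }
    fn advance(&mut self, n: usize) {
        self.cursor = (self.cursor + n).min(self.buf.len());
    }
}

fn main() {
    let mut p = Parser::new("hello world".to_string());
    p.advance(6);
    let moved = p; // moving the struct doesn't invalidate `cursor`
    assert_eq!(moved.rest(), "world");
}
```

Raw byte offsets into an allocation work the same way but need the "splash of unsafe" mentioned above when turning base + offset back into a reference.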
> If you self-reference using pointers and guarantee the struct will never move, you don't even need unsafe
I have a hard time seeing how you could use self references without a combination of raw pointers, pins/projects, and unsafe code. The tediousness of doing so is pretty much a no-go for any sane developer.
The only sane solution seems to be a generous sprinkle of Arcs, which is _not_ okay in high performance scenarios.
> Per-struct allocators are a work in progress
Yes, and most of us don't really want to use nightly in production. It's been years of work on the allocator work group already, and there's probably still years to wait before a stable release.
> Not sure what "non thread local addresses" means, but in my experience Rust is pretty good at sending data between threads
Well, I mean any way to ease the setup for storing and retrieving objects in shared memory, across processes that do not map that memory at the same address. This is very, very annoying to implement in Rust at the moment.
Don't get me wrong though, I think Rust has a lot to offer. But when you dive into even slightly technical subjects in Rust, it soon becomes obvious that the power of C/C++ is far from "just type unsafe" away.
> I have a hard time seeing how you could use self references without a combination of raw pointers, pins/projects, and unsafe code. The tediousness of doing so is pretty much a no-go for any sane developer.
You just do it. I can't recall a good example at the moment, so I've just thrown together a load of Cells: the thing I was doing when I learnt this technique didn't have any Cells in it.
> Yes, and most of us don't really want to use nightly in production.
Strictly speaking, this is only needed if you want to use Rust's standard library types with custom allocators. You've been able to have per-struct allocators for your own types since long before I learnt the language.
> Well I mean overall any way to allow a somewhat eased setup for storing and retrieving objects to/from shared memory, across processes that do not map said memory at the same location.
Can't you just store offsets into the memory region, and PhantomData references, then unsafely index the mmap'd region when you need an actual reference? Seems like the same thing you'd do in C, except the function you abstract that with can be a method instead. (Unless I'm still misunderstanding.)
> I optimize directly for the hardware I'm running on, which typically gives me 10-100x performance improvements. Controlling how memory is managed is critical.
> What makes you think you can't control how memory is managed in Rust?
> Arguably, Rust does make a number of memory layouts (self referential structs, per struct allocators, non thread local addresses, etc) much harder to accomplish than "typing unsafe".
So basically, the right question would be "can you explain what you mean that Rust can't control how memory is managed"?
Because the author knew what they meant; the Rust supporter didn't, and confused "work in progress" with an answer for someone saying "I need it now because I'm using it now, in production, in my daily job".
Yup, I’ve read it again, and I’m still pretty sure that you’re responding mostly to your own assumptions and not really responding to what was written :)
I very much enjoy Rust and write a fair bit of C++ as well. I don’t think Rust will replace C++ completely but it will definitely significantly eat into its market share for certain types of applications. In the past if you absolutely needed a “modern” language and needed performance you’d basically only go C++ (or maybe C).
IMO if you’re writing a highly concurrent or parallel application it’s hard to deny how much more productive and safer it is to write it in Rust. Optimizing cpu instructions in a single thread? Rust might just be an alternative and not competitive.
> C++, which is what Rust seemingly is designed to replace. It doesn't, as far as I can tell.
You can't replace C++:
1. Well-functioning, mission-critical systems that were made in C++ aren't being replaced.
2. Anything GPU and ML-related is still better solved in Python/C++.
3. Anything that depends on scalable, high-performance data structures, C++ still has a better selection.
What "replace C++" can mean is: There are use-cases where Rust and C++ overlap where the Rust story is maturing, and you can pick Rust in those cases instead of C++ if you're not already invested in C++. But either if you have existing C++ code that works well, or you have deep C++ knowledge, or you are in one of the categories where Rust is not on par yet, there's still no competition.
> 1. Well-functioning, mission-critical systems that were made in C++ aren't being replaced.
Very true. There is a lot of existing working code that is not going away and will even still need to be maintained for decades.
> 2. Anything GPU and ML-related is still better solved in Python/C++.
Can be done in Python/Rust
> 3. Anything that depends on scalable, high-performance data structures, C++ still has a better selection.
Why is that? Rust has equally high-performance data structures, even more performant in some cases (restrict by default, destructive moves, easier concurrency).
There are definitely fringe edge cases where C++ can do things that Rust simply cannot, currently.
I'm thinking of things like Herb Sutter's deferred_heap (https://github.com/hsutter/gcpp) that give you a GC-like abstraction. It's pretty cool that this is possible to write in vanilla C++ with decent ergonomics. I tried to make something similar a while back in Rust and hit a wall in terms of making something that would be pleasant to use.
Rust has several on-going experiments with making a nice GC, along with good support for arenas. (If you can use arenas, they're great.)
Are you sure you're not confusing it with Swift, or perhaps running Rust in unoptimized debug mode (-O0)?
Rust goes as low-level as C (there's literally c2rust source-to-source translator). It gives you full control over all allocations and indirections. It's even more efficient than C++ in a few places, e.g. it has a more efficient ABI for unique_ptr, doesn't call destructors redundantly after a move, doesn't need isa pointer in structs with virtual methods.
Rust generates native code using LLVM, and even has a bit stricter semantics that allow better optimizations (notably immutability and mutable aliasing are stricter than in C and C++).
Same here. We do a lot of "trust me, it's OK" operations in the name of performance, like very complex ownership semantics, manually issuing memory barriers in multithreaded scenarios, etc. Can't do any of that in Rust. The borrow checker usually forces a different design which is conceptually simpler and has fewer assumptions baked in (you end up with quite a bit of Rc/Arc, RefCells, etc.) but sacrifices quite a lot of speed.
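For readers unfamiliar with the pattern, a toy sketch of the Rc/RefCell style described above (contrived example; real designs are much more involved):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Shared mutable state that C++ code might express with raw pointers
// ends up as Rc<RefCell<...>>, paying a refcount bump on each clone
// and a dynamic borrow-flag check on each access.
fn main() {
    let shared = Rc::new(RefCell::new(vec![1, 2, 3]));
    let alias = Rc::clone(&shared); // refcount increment at runtime
    alias.borrow_mut().push(4);     // borrow check at runtime
    assert_eq!(shared.borrow().len(), 4);
}
```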
This has been my experience with Rust as well. The safety actually does cost some performance. Just like how C++'s "zero-cost" abstractions cost something.
It seems like a straightforward trade between speed and safety to me, but nobody in the Rust community wants to admit it. As such, there aren't a lot of public benchmarks of applications out there that can help the rest of us figure out when we actually should use rust - when the performance is worth the safety (no, Rust evangelists, that kind of safety is not always worth it, and no, we don't always pay the same costs for it).
There is a lot of C++ code out there. A lot of it was not written by people who understand performance or who care. It is very easy to take that kind of code and make it faster, even without a switch to Rust.
The examples I have seen where a Rust rewrite is faster than original C++ code usually have glaring performance issues in the C++: Often, it is code that does a ton of unnecessary copies. The Rust code often doesn't copy as much because the Rust language makes that kind of copying inconvenient or because the person doing the rewrite is a Rust expert and not a C++ expert. The other kind of speedup I have seen is when people swap a C++ std::map for a Rust btree_map, and don't realize that absl::btree_map (a C++ btree map in Google's library) is what Rust's btree_map is based on.
I haven't seen any examples where C or C++ code written with performance in mind gets a rewrite to Rust and goes faster, and I have seen the opposite.
> There is a lot of C++ code out there. A lot of it was not written by people who understand performance or who care. It is very easy to take that kind of code and make it faster, even without a switch to Rust.
Cliff Biffle's great series of blog posts, "Learn Rust the Dangerous Way", takes the fastest C program from a Benchmarks Game entry, rewrites it naïvely in Rust using copious amounts of `unsafe`, and then transforms it into a program without `unsafe` while keeping it idiomatic. The last version is faster than the C implementation.
I'll read it and take a look. I am aware that Rust does very well on the benchmarks game.
However, the benchmarks game (and microbenchmark-based comparisons in general) is hard to take seriously if you are thinking about application performance. Some languages microbenchmark very well, but don't translate that to system performance (C is the poster child of this effect), and some microbenchmark poorly but work very well in practical systems (Go is the most popular language with a big gap here, but some functional language like OCaml or Haskell probably has the biggest gap).
The reasons for these gaps can include things like it being harder to use the optimal data structure for your application (eg C code using red-black trees instead of btrees in 2023) and large code size causing terrible caching behavior (heavily templated C++). I also remember seeing something here about some non-optimal calling convention in the Rust compiler, which would be another thing that shows up in a system that doesn't in a microbenchmark.
Microbenchmarks are not a good replacement for system-level comparisons.
I think we're in agreement here. Personally, I was really looking forward to seeing "we rewrote our database in Rust from C++ and QPS improved 5% because fast code was easier to write." This kind of benchmark is not BS, and actually would save you real money.
Instead, the main arguments for the claim that Rust is the same speed as C++ are microbenchmarks, which actually are pretty useless when you are comparing very different implementations. The time they are useful is when you are refining an implementation.
If you want to scrutinize the benchmarks game even further, they use GCC as their C compiler, where Clang would probably be a better choice since the benchmarks are arithmetic-heavy.
When you get sufficiently close to the metal, the performance gains seem to be coming from better algorithms rather than from the programming language. Like you are saying:
> The reasons for these gaps can include things like it being harder to use the optimal data structure for your application (eg C code using red-black trees instead of btrees in 2023)
Bryan Cantrill described this experience[1] and maybe that is exactly what you are referring to.
But if you're just looking for a language to get work done with, does it matter if Rust is faster because of better off-the-shelf data structures and algorithms or because of some inherent magic in the programming language?
Yes, because my current alternative (modern C++ with Folly/Absl) provides those data structures already, so there's no real benefit to using the new thing.
That's great for you! Then stick with your current setup.
I don't already know C++. For me, learning Rust is easier than learning safe C++. I imagine that there are many people in a similar situation for whom Rust makes more sense than C++. That doesn't mean that it makes more sense for everyone.
Enough to stop performance slippage due to the increased amount of code and context. Unfortunately, the unit size is intentionally vague. For a lot of HPC applications, it's a small unit since they do the same math over and over again on large vectors (effectively the same as the microbenchmark). For databases and programs with significant business logic, it's a huge unit, almost a full-system test.
For the companies that re-wrote their databases, login systems, and other similar things in Rust, a real benchmark comparison would be pretty easy, and they probably did it internally anyway. Hook one of your servers up to an artificial load generator and see how much it can take.
A few years back, Convey[1] apparently outran HAProxy in a benchmark by its author[2]. That's a one-man project (now abandoned, sadly) outrunning a decade-old product built by an enterprise company AND a big community AND spearheaded and designed by a data-structure genius. Granted, only in one of the many tricks HAProxy can pull, but still. Not a database, but indeed concurrent, world-facing RealWork software. If true (I didn't actually check myself), I'd say it fits your bill.
Personally, I read that as "can be as fast as, but without you having to be Willy Tarreau level genius" which is all I need.
It's easier/more-intuitive to do a lot of things in C++, but safe, high-performing C++ is certainly harder than safe, high-performing Rust for huge swaths of use cases. Also, as has been mentioned, Rust's type system, which benefited from PL research since the '80s, also allows for nicer expression of business logic. In particular, this means that in Rust, unlike C, Go, or in great part even C++, you are not writing in the same low-level, intricate language at every level of your stack, i.e. it can be a nicer high-level experience the higher you go, if you designed your lower tiers well.
And that last thing to me is the biggest advantage it has over the competition.
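A minimal, hypothetical sketch of the kind of business-logic expressiveness meant above (the `Payment` type and `describe` function are invented for illustration, not from the thread): sum types plus exhaustive, compiler-checked matching.

```rust
// Hypothetical example: an enum models the business domain, and the
// compiler enforces that every variant is handled.
enum Payment {
    Card { last4: String },
    Invoice { net_days: u32 },
}

fn describe(p: &Payment) -> String {
    // Adding a new Payment variant makes this match a compile error
    // until every call site handles it.
    match p {
        Payment::Card { last4 } => format!("card ending {last4}"),
        Payment::Invoice { net_days } => format!("invoice, net {net_days}"),
    }
}

fn main() {
    let p = Payment::Invoice { net_days: 30 };
    println!("{}", describe(&p)); // invoice, net 30
}
```

Nothing here is impossible in C++ with std::variant, but the point is that this style stays ergonomic at the higher layers of a Rust codebase.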
Of course, there is also the fact that juggling dependencies in a non-trivial C++ project was a nightmare until recently with vcpkg and its manifest mode, and that will probably take another decade to become commonplace in the ecosystem (if ever).
You've linked to an implementation full of hand-written SIMD, I'm not sure that's a great comparison one way or the other.
Which is my biggest gripe with benchmark games. There really ought to be strict categories for "straightforward, idiomatic code someone with 3 YoE could write", "optimized-but-still-maintainable code an experienced senior engineer would write", and "unrestricted wizardry".
It seems inherently difficult to define strict categories around the soft constraints that you are listing.
But it would be interesting to try to approximate something like those categories. The project repo[1] contains the following call for idiomatic code:
Please, people ask to see more "idiomatic" programs —
- we already have enough exhaustively optimized Rust and C programs.
- we already have enough hand-written vector SIMD and "unsafe" programs.
Thank you.
>Often, it is code that does a ton of unnecessary copies.
Often, code written in C / C++ does unnecessary copies precisely because the risk of avoiding those copies is hours of debugging, crashes in production, or security flaws.
So this is a legitimate benefit of Rust. Being able to code more aggressively up front, without fear of something blowing up in your face later, or much later when the intern touches the wrong line.
>I haven't seen any examples where C or C++ code written with performance in mind gets a rewrite to Rust and goes faster, and I have seen the opposite.
The point he makes is that while Rust is not strictly faster than C in most cases, it gives you the comfort of being able to use more highly optimized libraries and data structures without needing to have the extreme level of trust in the author that you do in C / C++. This is basically the same argument.
This is a comparison with C that I am familiar with. It illustrates the reasons why C++ won for many applications (compared to C). The author also is on record as someone who hates C++, and is very careful about not making any claims in that blog post about C++ because that's not his area of expertise.
Compared to C, where a data structure library is at best a heap of cobbled-together macros that make a lot of assumptions about the underlying types, C++ has well-developed libraries for all of the data structures mentioned.
And you need to trust the author plenty to make a btree that works - that is a very complicated data structure. It's not just about memory safety, there can be a lot of functional bugs, too. The author of the C++ one is Google/Facebook, who I trust to write a solid library, while the author of the Rust one is someone who was inspired by the C++ version.
At this point, both are well-tested, but there are a lot of Rust libs in a similar position that are still relatively untested.
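The B-tree point above in miniature (a generic sketch, not code from either library): an ordered map that in C typically means pulling in and trusting a third-party red-black-tree library is one import away in Rust's standard library.

```rust
use std::collections::BTreeMap;

// Returns the keys in sorted order; ordered iteration is a property
// of the B-tree layout, no extra work required.
fn sorted_keys() -> Vec<&'static str> {
    let mut m = BTreeMap::new();
    m.insert("b", 2);
    m.insert("a", 1);
    m.insert("c", 3);
    m.keys().copied().collect()
}

fn main() {
    println!("{:?}", sorted_keys()); // ["a", "b", "c"]
}
```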
True: not my area of expertise as I have only written ~100,000 lines of C++ -- and that that is insufficient for C++ to be an area of expertise is itself very revealing...
It's true, and it is a big problem with C++. I'm not sure if you were being sarcastic here, but if those 100,000 lines are not heavily back-weighted towards 2014 and later, that amount of code probably doesn't make you a C++ expert unless you were doing things like writing parts of boost (or another template library) or you happened to decide to read a few hundred pages of the spec recently.
The language-change whiplash that people accuse Rust of happens to C++ folks who leave the language for several years, since "idiomatic" C++ has changed substantially over the last 20 years.
EDIT - Just to clarify, there are SO MANY different ways to do things in C++ that the way to really learn C++ is by reading (both good code and the spec), not by writing. It's very easy to write a ton of code and not actually do it in the fastest/best/most idiomatic way. This is complicated by the fact that a ton of C++ examples out there are wrong. It's really an electrical engineer's language at this point - read all the datasheets if you want to avoid getting shot in the foot (and still get shot in the foot anyway).
> or because the person doing the rewrite is a Rust expert and not a C++ expert
On this subject... it's much easier to become a Rust expert than a C++ expert.
I do expect the world to have more Rust experts than C++ ones today, even with much lower usage of the language and much less time for people to learn it.
Good point, but what is the comparative effort for both of those rewrites making similar performance improvements, and what skill level in the coder is needed to realize those similar improvements?
E.g., will the C++ rewrite for X improvement take longer, or require a more highly skilled programmer, or would the Rust programmer require some high level of expertise to yield the results but a junior C++ guy/gal could knock it out quickly?
Has anyone done real comparisons of this sort, such as taking software package X and assigning a performance rewrite to a junior Rust dev, a senior Rust dev, a junior C++ dev, and a senior C++ dev, giving each a couple of weeks and then checking the yielded performance?
Also, how does this interact on teams? It's been a long time since I've written any C++, and I've not tried Rust, but it seems some of the Rust safety features make it harder for devs to step on each other's toes.
Parent may have meant that the second implementation might perform better, even if it’s in the same language, because you learn from the first implementation.
I thought the safety came from compile-time checks and semantic analysis, which also enable the generated code to carry optimizations that show up as improved performance at runtime.
I am not enough of a Rust compiler expert to tell you exactly where the slowdowns show up, but the common theme has been that a benchmark of a small section of the critical path shows equal performance (sometimes a little faster, sometimes a little slower), and then when you expand to a larger scope, benchmarks start to slow down in comparison to their C++ alternatives.
My intuitive guesses would be that the culprit is actually either larger code size, worse code layout, being unable to optimize certain code sections to the same level (undefined behavior in C++ is actually helpful for speed, and not all of it is actually unsafe), something about how calling conventions or struct layouts differ, or another "second-order" effect.
You're probably running Rust code in debug mode, or something else is going on. Without more info, hard to say. But for tasks where the purpose is to write the fastest code possible, Rust is very competitive and regularly outperforms C and C++. Sure there are perf differences, but 10-100x figures are unheard of unless you're doing something really wrong.
Having written both my sense is that C++ will indeed be ultimately faster, but with a lot of work. You can get pretty close with Rust, with less pulling your hair out, and much faster coding time. Whether that's worth the difference to a fully optimized C++ depends on what you're up to.
In GitHub's analysis of fastest-growing languages, they state that HashiCorp Configuration Language was the fastest from 2021 to 2022. In JetBrains' 2021 survey, it is in the top five of languages to adopt and/or replace. I would like to see the data that MIT is using to support their thesis.
That's a bit pedantic. HCL is not a language you can ever use to write or distribute a program. It's for configuration files. This is almost like saying "YAML is the fastest growing language" a few years ago. It doesn't contribute to a healthy discussion.
MIT can very well use the exact same data, where it shows Rust at the top at 50.5% (excluding config languages like HCL).
I've been writing some TypeScript lately. I had gotten a bit bored of Rust, but oh man. The weakness of TS makes me really appreciate the sort of invisible things Rust does right.
I would love to know how you build a language that:
A) is easy to write one off scripts that do a job fast, with minimal thinking and effort. I am thinking of Python and Ruby. For me I can write code with high velocity in these languages.
B) Executes loops very fast and does automatic vectorization
C) Can scale toward large teams, such as that you can do drastic changes with faith that things shall not break at runtime.
D) zero cost abstractions and minimal indirections.
E) predictable runtime performance, no random pauses
F) powerful tooling, such as package manager and IDE integration
> A) is easy to write one off scripts that do a job fast, with minimal thinking and effort. I am thinking of Python and Ruby. For me I can write code with high velocity in these languages.
Once I wrap up some other projects, I plan to explore this space a little bit within Rust.
> imo the biggest bang for the buck is just having good `#!` support.
Shebangs don't really make sense for AOT compiled languages and I don't know why you would really want one, but you can add `#!/usr/bin/env -S cargo run` to the top of a Cargo.toml file today and it will "work."
I just think the premise of "why didn't you write your last script in Rust" is flawed. It's a systems programming language, if I'm writing scripts I don't really care about the same things - like static typing or meticulous error handling.
It sounds like high effort and low reward imo.
A REPL though (not a true one necessarily - but live-reloading of dynamic rlibs and using dynamic dispatch in place of monomorphization for REPL-builds) would be extremely useful as a workflow tool.
It makes a big difference when you share a code sample, whether in a blog or on an issue, to have a single block of text to copy/paste including dependency declarations.
It also lowers the barrier for experimentation. I have a directory with probably 40+ cargo-script's for one-off reproduction cases of bugs and it is much easier to work with than juggling multiple files and the overhead associated with it.
I think Kotlin is pretty close to that albeit not satisfying (D) and only satisfying (E) if you use a pauseless GC (which you can do, it's just one command line flag). The pervasive use of type inference gives it a Pythonic feel.
My company has a private tool called hshell that's completely replaced bash for my own use. It uses Kotlin Scripting to provide hashbang script-style development along with a UNIX-like API with functions like cp, cd, ls, find, tar, zip, ssh etc. You can also do things like print markdown to the console and it'll be formatted, and it understands how to draw progress bars from various sources like file copies or loop iterators.
By adding a shebang line to the top of the file, IntelliJ can give full intellisense and refactoring support for these scripts; they're portable, you can define a CLI by annotating variables, and there's a variety of other useful features.
As it runs on the JVM you also get auto-vectorization, although why you'd want that for one-off scripts I'm not quite sure.
It doesn't have zero cost abstractions. As you note, combining all those features together is rather hard if you want to keep a lightweight scripting feel.
Overall it's pretty nice. I wouldn't want to go back to bash or python, the ergonomics of hshell are far better. I'm not sure what to do with it though, maybe release it as a product (but at what price? is there any demand?). I'm pretty sure I don't want to sign up for open source maintenance duties right now.
Good questions. Clear, organized. Please share what you find as you compare existing languages.
Regarding (F) -- this is partly about (a) the language's interop story and partly about (b) building a community of people that get along, get things done, and strike a balance between being independent enough to generate something useful and different and being open to compromising for industry adoption (which often means "slowing down" and stabilizing when the time is right).
(a) There are a lot of great ways to ease editor integration. One is to build/maintain language server / analyzer. See https://rust-analyzer.github.io. Another is to share and maintain a formal grammar.
(b) I've been very pleased with my interactions in the Rust community in this regard. There are certainly many ingredients to mix in ways that make sense for a particular community. In my view, the key aspects are (i) be clear about what your language community stands for; (ii) strike a balance between fairness, transparency, and privacy in how you get there; (iii) build in accountability and feedback loops
If you want the best possible abstraction and developer-scalability, you won't get the best possible performance. In fact, between the quickest start, largest developer-scalability, and best performance, you can only pick one.
> “In C or C++ you always have this fear that your code will just randomly explode,” says Mara Bos, cofounder of the drone firm Fusion Engineering and head of Rust’s library team.
That sounds like a completely valid reason to abandon C++ for Rust.
I mean, what self-respecting programmer that keeps up to date with the trends would use a such a dangerous language that makes code blow up in their face?
That is so '80s.
Well, I am joking, of course, but that comment is hard to take seriously.
Yes, in C or C++, code will just randomly explode. That's a completely accurate description.
Once upon a time we had reports from our users that our software had deleted all of the files it could on their C:\ drive. Investigating, we found the code that caused it, something to the effect of:
Want to guess what happened? We had a mix of Emacs and Microsoft Visual Studio 6 developers, and somehow we ended up with a mix of carriage return / line feed in that source file. Both Emacs and the MSVS6 IDE showed those as three separate lines... but the MSVS6 compiler thought that the comment... had no terminator. So it took the assignment on the next line as just being part of the comment.
You bet your ass I worry about code blowing up in my face. I write unit tests to try to protect myself.
I can't tell you how many times the "undefined behavior" was exactly what the code depended on, and fixing it required some severe reworking.
I also can't tell you how often someone thought "const" meant "completely immutable," and then they went on to accidentally modify what the const pointer was pointing at.
Or how often I've had to fix someone else's race condition. Or how often a driver crash has clobbered me.
I came across functionality like this in our code:
  int i = 2;
  int b = 3;
  printf("i = %d, b = %d\n", (i, b));
Want to guess what that does? (i, b) evaluates i, and then ignores it, and returns b. So it basically prints "i = 3, b = [core dump.]" The MSVS6 compiler gave a nice juicy warning on that line that everyone ignored. Yes, using the MSVS6 compiler was dumb, and ignoring the warnings was dumb, but guess what? I'm not in charge of how dumb my co-workers are, or how strict everyone's deadlines are, and bugs like this are actually kind of hard to find in a real code base with tons of problems going on.
I'll take some more predictable behavior, if someone's offering it; yes, please.
-Wall -Werror is your friend in C and C++ (even -Wextra sometimes). Zero-error policies are generally good. The spec in each language allows a lot of crazy undefined behavior.
Conversely, the Rust spec defines... nothing. Rust is specified as "whatever rustc does." That means the compiler is free to be arbitrary on this sort of thing. When there is a spec and there are competing implementations, there will be behavior divergence, and that will create these kinds of issues. Not having a spec can be a good thing.
I'd love it if you could convince my old board of directors of that. But their first step would probably be to fire everyone who had survived the nightmare, because they "didn't already address it."
There are a lot of places where CVEs don't matter. The HFT firm I worked at groaned over Spectre and Meltdown mitigations, because now our CPUs will be slowed down by not using unsafe behavior. Similarly, I bet a lot of video game folks also didn't like them.
If you aren't connecting to the network or running in any sort of privileged mode, the space of attacks that you need to care about is pretty limited.
Ironically, the answer to a lot of unsafe input in Rust is to crash (if your error handling logic didn't catch it), which is something that you likely care about a lot. Most players care a lot less about corruption of the internal state of your video game than they do about crashes.
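To make the crash-vs-corruption trade-off above concrete, here is a minimal sketch (the `lookup` helper is invented for illustration): out-of-range access either panics deterministically or is surfaced as an `Option` you must handle, rather than silently corrupting state.

```rust
// Checked access: the caller decides between handling the miss and
// panicking explicitly; plain `v[i]` would panic deterministically
// rather than read out of bounds.
fn lookup(v: &[i32], i: usize) -> String {
    match v.get(i) {
        Some(x) => format!("got {x}"),
        None => "index out of bounds, handled".to_string(),
    }
}

fn main() {
    let v = vec![1, 2, 3];
    println!("{}", lookup(&v, 1));  // got 2
    println!("{}", lookup(&v, 10)); // index out of bounds, handled
}
```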
You can disable such mitigations, which I'm struggling to credibly imagine inside a universe where an HFT firm (that is all about optimisation) would not know about.
The silicon did get slower due to a few mitigations, and in 2018, Intel was thinking about forcing a lot more of them on us in silicon (eg the equivalent of retpolines, but in silicon). The HFT industry and some HPC folks (and I assume some people from gaming) were involved in convincing Intel to keep those mitigations in software.
EDIT: One fun fact here is that AMD never did the unsafe performance optimizations that Intel had to remove from their silicon. Spectre and Meltdown mitigations were instrumental in closing the gap between Intel and AMD.
Corruption of internal state is generally what leads to crashes.
But yes, having worked in service games for over a decade now, clients are treated as untrusted agents regardless. CVEs that result in control over local execution are mostly not interesting. At least for the purpose of maintaining the game; platform holders _do_ care, mostly on fears of piracy.
Galoob didn't cause revenue issues with Game Genie in the '90s. And revenue is the thing that matters.
>For some projects, like browsers, that's a big deal, for others, like games, not so much.
Cheating in games is a big problem, and lots of money is spent addressing it, not to mention techniques that are barely distinct from spyware and cause performance reductions on the side.
A lot of that cheating relies on bog standard memory bugs which game devs don't care about, apparently.
> I find the whole C++ bashing bizarre. […] You can explode your code in any language you desire
Some of us have spent decades working as sysadmins, getting woken up in the middle of the night to deal with whatever the latest CVE is... Do kids these days even know about codered, heartbleed, etc?
When you're working with software controlling lithium batteries, things literally blowing up in your face because of code mistakes are an actual concern.
Things figuratively blowing up in your face are a concern even when not working on batteries.
I just wish it was simpler. I never saw it in its infancy, but from what I can gather, it seems a huge amount of bloat was added to the original design.
I'm biased as a Rust fan, but I don't think this is accurate. I think this impression comes from people who don't know Rust reading the release notes and assuming that the rate of new features must mean the language is full of old legacy junk that no one recommends using anymore. But there are actually very few things like that in standard Rust. (The old `try` macro and the `ref` keyword are two examples. std::sync::mpsc is maybe another.) A large portion of the changes have been fleshing out APIs and removing limitations, which arguably makes the language less complicated over time. "Nonlexical lifetimes" and "match ergonomics" are two examples of this, which each took a lot of work.
Thanks to you and the other commenter. It seems my perception of the timeline is indeed incorrect, although I'm not sure my view of its complexity is. I have found it difficult to dive into but want to give it another whirl here at some point.
There's definitely a learning curve, no doubt about that. I usually advise learners to expect to read a book. It doesn't have to be The Book, and there are several other good ones to choose from, but expect a book. For most people, learning from examples and compiler errors isn't the way to go.
I've programmed in Rust since 0.7, and I can assure you that Rust today is the leanest, easiest-to-learn version of the language.
Since 1.0, Rust has added the ability to use arrays larger than 32 elements. That's not bloat; that's a fix of an embarrassingly large TODO left in 1.0.
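For anyone who didn't hit this limitation: on early stable Rust, std's trait impls for arrays were macro-generated only up to length 32, so the snippet below (a trivial sketch) wouldn't have compiled as written.

```rust
fn main() {
    // On Rust 1.0, Debug/PartialEq/etc. for arrays stopped at length 32,
    // so a [u8; 33] couldn't even be printed or compared. Const generics
    // removed that limit for every length.
    let big = [7u8; 33];
    assert_eq!(big, [7u8; 33]);  // PartialEq works at any length now
    println!("{:?}", &big[..3]); // [7, 7, 7]
}
```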
Rust added a new module system, which most people find easier to use than the original one. The new one technically does more, but these are things that everyone expected it to do anyway (e.g. in Rust 1.0 use of `std` in code worked only in some files, sometimes, because reasons. Now it works in every file, even though technically it's a more complicated name lookup).
Rust has added a new borrow checker. The implementation is waaay more complicated than the original scope-based one, but the result for the end user is that it mostly just works. In Rust 1.0, lots of borrowing didn't work for dumb reasons. You used to have to declare variables in precisely the reverse order of their destruction, or the code wouldn't compile. You often had to add extra {} around lines of code that mutated collections, because the simplistic borrow checker didn't understand when the mutation ended.
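A small sketch of the pattern described above (the `demo` function is invented for illustration): under the old scope-based checker this shape of code needed extra `{}` to compile, and under non-lexical lifetimes it just works.

```rust
fn demo() -> (i32, Vec<i32>) {
    let mut scores = vec![1, 2, 3];
    let first = &scores[0]; // shared borrow of `scores`
    let first_val = *first; // last use of the borrow
    // With non-lexical lifetimes, the borrow ends at its last use above,
    // so this mutation compiles. The original scope-based checker kept
    // the borrow alive to the end of scope, rejecting the push unless
    // the borrow was wrapped in an extra `{}` block.
    scores.push(4);
    (first_val, scores)
}

fn main() {
    let (first, scores) = demo();
    println!("first: {first}, scores: {scores:?}");
}
```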
Even async, which itself is a big new feature, simplified networking code a lot. I've written a network service in Rust before async, and it was a pain, and run-time overhead, to deal with ownership in callback closures that required manual reference counting and didn't allow references. Async in Rust still has its quirks and limitations, but Rust 1.0 had all of them, and then some more.
Most importantly, Rust error messages have massively improved since 1.0. Things that used to be gotchas where people were getting completely stuck (such as needing .as_ref() on Option) are now pointed out by the compiler.
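To illustrate the `.as_ref()` gotcha mentioned above (the `name_len` helper is invented for illustration): calling `.map()` through a reference tries to move the contents out of the `Option`, and modern rustc suggests `.as_ref()` right in the error message instead of leaving you stuck.

```rust
fn name_len(name: &Option<String>) -> Option<usize> {
    // Without .as_ref(), `name.map(|s| s.len())` would try to move the
    // String out of an Option behind a shared reference, which is the
    // classic error this paragraph refers to. .as_ref() borrows instead:
    name.as_ref().map(|s| s.len())
}

fn main() {
    let name = Some("Ferris".to_string());
    println!("{:?}", name_len(&name)); // Some(6)
    println!("{:?}", name);            // `name` is still usable afterwards
}
```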
The compiler is now much faster, and generates better code. Rust has matured, and almost everything it has added since 1.0 should have been in 1.0.
The title of the article is actually, "How Rust went from a side project to the world’s most-loved programming language" .. which is so annoying.
"Most-loved" cannot be measured; it's so subjective. Why couldn't they have just said "fastest growing" or something else? Does everything from media have to be designed to grab eyes, even out of the MIT Technology Review, for God's sake?
There is a C SSL library and a Rust SSL library; which one do you choose?
There is also an HTTP library in C and in Rust; which one do you choose?
If the C ones are battle-tested, I will not choose Rust, for a simple reason: software is error-prone in nature, and believing that certain practices will fundamentally change that is just selling snake oil.
> The obligatory LGBTQ nonsense, “Rust has more trans and queer coders” is so tiring. Is there some 4chan-esque programming language that actually discourages queer people from using their language? In fact, is there any programming language forum that’s not friendly to “n00bs”. My guess is every language is friendly to “n00bs” in the forums and this doesn’t make Rust any more special than other languages (Note this has nothing to do with whether the language is actually easy to pick up, only whether people on forums are friendly to beginners)
I assume you're posing a rhetorical question (implying that no languages are better or worse in this regard) rather than saying you actually want a language that's intentionally unfriendly to queer people. In that case:
The relevant part in the article is quoting someone saying that because the Rust community adopted a code of conduct prohibiting harassment and took the attitude that no question is stupid it attracted more queer and trans coders and n00bs. Are you saying that you don't think it's possible that this type of code of conduct can make a forum more friendly? It doesn't seem that unreasonable (or some sort of outrage of wokeness run amock) to think that queer/trans coders are going to prefer a language where they know the community isn't going to harass them.
>It doesn't seem that unreasonable .. to think that queer/trans coders are going to prefer a language where they know the community isn't going to harass them.
I'm struggling to understand why interacting with an online programming community about aspects of a programming language requires you to announce your sexual preferences or gender identity?
so it's funny to me that they are still called "communities" when everyone has their own strong individual identity that you have to respect (meaning you need to acknowledge it) while talking about something completely different.
It's like going to the gas station and, instead of asking "how much" for the gas, starting a conversation like "I am bald. Do you like bald people? Is this gas station aware of bald people's existence? Do you recognize us as people? Do you stand by our side or not?" and then leaving a review: "Did not speak my language. Not bald. Probably racist. Won't come again."
Anyway:
HCL is the fastest growing language on GitHub, faster than Rust and TypeScript
In my experience, it's less that there's specific languages to avoid, and more that there's a general baseline of unpleasantness on the internet towards those folk. As a result, they tend to steer towards the islands where that doesn't occur.
The point is that there's less of a repressive force from some areas and more of an attractive force to others.
There isn't a list of "These are the problem languages", it's an experience of "Oh, this community has a CoC that protects me and has a culture I'm OK with".
I am capable of making assumptions by myself; I fail to see how these assumptions are relevant in the real world, where I have never observed them except for isolated cases that exist everywhere and happen with more or less the same frequency.
I can't take a hypothetical as something that should convince me to use Rust because "some other community is worse in my fictional story".
The point isn't to convince you to use rust, it's to explain based on my experience why the phenomenon that's been observed occurred.
I can say from my personal life that I have avoided or actively gone to stores based on similar perceptions, and tend to analyze the same information whenever joining a new online community.
"This thread came as my response to this post by CppCon. Nothing has changed. They invited a person convicted of possession of Child Sexual Abuse Imagery to a conference with a child care program. Where people are encouraged to bring their children.
https://cppcon.org/2022dipost/"
(note that this is a much higher bar than any mere "cancellation", the person in question has a criminal conviction)
This needs to be emphasized. The internet wasn't wildly different when Rust was introduced, but it was a different time. Communities devoted to technical things (like programming languages) were harder to find, and generally felt smaller. On a cultural level, Me-Too and Trump did a lot to bring what OP labeled "obligatory LGBTQ nonsense" to the forefront. Before that, LGBTQ people did face a more casually open sense of hostility on a regular basis.
There is a direct line between fostering an environment that is open to all sorts of marginalized groups, the people joining that environment being more inclined to help out with the core project, and the core project successfully growing a large user base.
Every Rust post brings out one of these. Where's the evidence? The linked article doesn't even touch on it. rust-lang.org isn't a big rainbow flag or anything.
At most, people seem to be offended by the community code of conduct, which sort of lays out the golden rule with some added verbiage for the thick skulls out there?
Even if it was a big rainbow flag at rust-lang.org, who the hell cares? The language is fantastic and solves real problems.
> Every Rust post brings out one of these. Where's the evidence? The linked article doesn't even touch on it. rust-lang.org isn't a big rainbow flag or anything.
The article actually does touch on it, but it's very easy to miss, since it's just mentioned in passing as one example in a small section noting ways in which the language tried to build "a culture that was known for being unusually friendly and open to newcomers," which makes it even weirder that someone would be so triggered that they would need to post a whole rant about it and dismiss the entire article as "obligatory LGBTQ nonsense":
> Along the way, the Rust community was also building a culture that was known for being unusually friendly and open to newcomers. “No one ever calls you a noob,” says Nell Shamrell-Harrington, a principal engineer at Microsoft who at the time worked on Rust at Mozilla. “No question is considered a stupid question.”
> Part of this, she says, is that Hoare had very early on posted a “code of conduct,” prohibiting harassment, that anyone contributing to Rust was expected to adhere to. The community embraced it, and that, longtime Rust community members say, drew queer and trans coders to get involved in Rust in higher proportions than you’d find with other languages. Even the error messages that the compiler creates when the coder makes a mistake are unusually solicitous; they describe the error, and also politely suggest how to fix it.
> “The C and C++ compiler[s], when I make mistakes, make me feel like a terrible person,” Shamrell-Harrington says with a laugh. “The Rust compiler is more like it’s guiding you to write super-safe code.”
Yes, I would say that in light of recent happenings in the C++ Foundation board and committee ANYONE would be discouraged from participating in them, and by extension, the language.
And I would expect that women and LGBTQ persons, having a vastly higher chance of having been exposed to sexual predation, would be much more wary of this and pick up on such red flags much faster. Thus, understandably, they stay away from the language and are drawn to a more inclusive community like Rust's.
It's actually notable because it's a visible side-effect of the fact that Rust is attracting lots of young n00b coders from the Python, Ruby and JS/TS ecosystems, not just C/C++ greybeards. It's successfully spanning the divide between systems-level and higher-level languages.
I guess it proves a strategic point about championing diversity. People often say that you hamstring a project by putting queer stuff front and center, alienating 'normal people'. In reality, 'normal people' are pretty rare, so it's typically good strategy (as well as good ethics) to be unapologetically inclusive.
> People often say that you hamstring a project by putting queer stuff front and center, alienating 'normal people'.
I would say that it risks alienating people who are tired of finding politics injected into every corner of their life. And when you have to spell out carefully how inclusive you are, be sure you actually include everyone. That list gets pretty long.
If you look at any one trait, you might be able to pick the "normal" variation of that trait by its frequency and yeah, in that case it would not be rare.
But when you combine traits, the probabilities multiply, and that quickly reduces your probability of being "normal" with regard to all traits. For example, if two independent "normal" traits each have a 2/3 probability, the probability of being normal with regard to both is 4/9, already below 50%.
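The arithmetic above can be checked directly. A minimal sketch (the 2/3 figure is just the commenter's illustrative assumption, as is the independence of the traits):

```rust
fn main() {
    // Assume two independent traits, each "normal" with probability 2/3.
    let p: f64 = 2.0 / 3.0;

    // Independence means the probabilities multiply.
    let both_normal = p * p;

    // 2/3 * 2/3 = 4/9, which is already below 50%.
    assert!((both_normal - 4.0 / 9.0).abs() < 1e-12);
    assert!(both_normal < 0.5);
    println!("P(normal on both traits) = {:.3}", both_normal);
}
```

With more traits the product keeps shrinking, which is the commenter's point: being "normal" on every axis at once becomes rare quickly.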
The definition of the word normal. The argument you're making here is merely playing with numbers, it has no intellectual merit. Normalness is not defined mathematically, and doesn't mean the intersection of an infinite number of arbitrarily selected traits. The meaning is clear to anyone who speaks plain English.
I'm aware of how you're defining it, I just don't care.
Don't bother with whatever appeal to authority you're about to search for: I don't care what the dictionary says, either. Dictionaries describe how people use words, and sometimes the way people use words changes faster than the dictionary can keep up. People using words are the source of information in the dictionary, and viewing the dictionary as an authority on meaning is a fundamental misunderstanding of how dictionaries work.
> Normalness is not defined mathematically, and doesn't mean the intersection of an infinite number of arbitrarily selected traits.
Normalness is defined by the common understanding of the people who use the word to communicate. Usage determines definition, not the other way around.
If you understood what the person meant enough to know they're using a word in a way that you aren't used to, then communication occurred.
If you really want to fight about this particular dictionary definition not being mathematical, you should look up the history of the word, because you're very wrong about that. But that's rather silly, because usage determines what the definition is, not the other way around.
> The meaning is clear to anyone who speaks plain English.
True! Which is why it's bizarre that you would insist on a different meaning when everyone, including you, understands what is being said.
Yes, and in the context of pasabagi's post he's using the word normal to mean non-queer/straight people, who are in fact not rare, they're the vast majority.
I was actually punning a bit on the word 'queer'. The point is, 'normal', as in straight, able-bodied, male, white, neurotypical, etc., is a minority, except in terms of representation, political or otherwise. This becomes a majority by creative coalition building (white people, male people, neurotypical people, non-disabled people, etc.). You then turn around to the parts of the Venn diagram that don't carry all of these traits and treat them as a group whose concerns are marginal and not generally relevant.
My guy, two posts ago you were telling me what pasabagi meant by "normal", and then when pasabagi themself comes to tell you what they meant by "normal", you're still fighting them on it?
If you just want to have an argument with someone, why don't you start an argument with someone who isn't trying to represent the underrepresented?
> Nobody defines normal as meaning specifically male, women are the majority of the population.
Not explicitly, but there are a lot of cases where female interests are treated as deviations from the norm.
I thought you said "normalness isn't defined mathematically". Which is it: do we define normal based on numbers (i.e. the majority) or not? Or does the definition of normal just shift in whatever way is convenient for you to pretend your viewpoint is the majority viewpoint?
Eh, I honestly thought it was common sense. Everybody gets disabled at some point in their lives, yet we treat disabled people as a special edge case. Women's health problems don't get as much funding or attention, because they're also treated as a special case. 'Normal' is and always has been a political category, not a physical one.
> Is there some 4chan-esque programming language that actually discourages queer people from using their language?
There isn't. I find that any organization priding itself on employing (X) group of people is being disingenuous at best, and preferential at worst.
Personal traits are irrelevant in a professional setting, let alone an online one where they are hidden away behind written communication. So to actively seek them out, or worse, place them on a pedestal, only signals that you reward personal traits over actual work ethic and ability. You're essentially taking pride in what your employees are rather than what your employees can do.

Such organizations tend to go down a purity spiral of policing human interaction and speech, as well as preferentially treating employees, to the detriment of product quality. They often implement measures that are low in merit but high in potential for abuse, such as forcing contributors to sign vague agreements[1], or banning "offensive" numbers[2] from being used.
I wouldn't trust such an organization with my machine or data. The comments in the article do nothing but instill doubt in Rust's quality as a product and its future.
1. Nobody is required to "sign" the code of conduct.
2. That does not ban any numbers from being used. It is a lint in the compiler's codebase, where they do not wish those values used in examples. It applies only to the compiler.
And why 0xCAFED00D? If this is a reference to something, I don't get it, and it probably shouldn't be there if it's not so popular that everybody knows about it.
The best I could find about it is "("cafe dude") is used by Java as a magic number for their pack200 compression."
Once again, I'm not involved in any way, so my opinion is irrelevant.
But sure: I don't know Italian slang. However, a project that conducts both itself and its examples in English not being comprehensive about possible meanings in other languages isn't exactly a gotcha. Furthermore, the example is suggesting that you don't pick at random, but instead something that is obviously not a word, like 0x12345678. It's not suggesting that you make something up.
Furthermore, this isn't something about trying to be "absolutely pure" or something. It's a helpful lint to catch common cases. Nothing more, nothing less.
Rust is definitely more friendly to noobs than is typical, because they are obsessive about inclusivity. The compiler warnings are the most visible example of this: they show you exactly where the problem is, offer a deeper explanation, and will often suggest a fix. That's a significant undertaking, and the only other language I know of with anything similar is the latest Python.
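As a small illustration of the kind of diagnostic being described (my own example, not from the article): compiling this deliberately imperfect snippet, rustc emits a warning that points at the exact span of the unused binding and suggests a concrete fix (prefixing the name with an underscore).

```rust
// `unused` triggers a friendly rustc warning along the lines of
// "unused variable: `unused` ... help: if this is intentional,
// prefix it with an underscore: `_unused`", pointing at this line.
fn double(x: i32) -> i32 {
    let unused = x + 1;
    x * 2
}

fn main() {
    assert_eq!(double(21), 42);
    println!("double(21) = {}", double(21));
}
```

The program still compiles and runs; the point is that the diagnostic describes the problem and proposes the fix, rather than just rejecting the code.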
I've had Steve Klabnik calling me out more times than I can count because of the things I wrote on Reddit.
It's not more inclusive than anything else; it's just more like a hive mind.
People try Rust because they think it will get them better salaries and automatically "safe programs"
They will flock away as soon as something else newer appears.
OTOH I've been part of the Elixir community since the beginning, and I spoke personally with its creator Jose Valim and Erlang creator Joe Armstrong, and they turned out to be the most welcoming, humble, uncredited geniuses among programming language creators, who also built two of the best-curated and best-functioning communities there are. No bullshit: high-quality software, organic growth, tooling, etc. Rust in comparison is just a C++ community with younger and somewhat ageist/elitist members, with diminishing results over time (yes, they are rewriting stuff that already works, WOW!)
How about improving the tooling?
I watched the video on Matrix 2 at FOSDEM, and when the speaker said they wrote the new client (ElementX) using the Rust SDK, people applauded for no reason, and he said: "Rust fans as usual".
It's actually good to support marginalized people. While I don't identify as LGBTQ, I have experienced elitism and disrespect in IT jobs and communities and have certainly seen my share of racism and sexism.
A community that builds kindness and compassion for marginalized members into its DNA will probably extend that same kindness and compassion to everyone. Literally no downside to elevating LGBTQ members as far as I can tell.
I am wondering when LGBT people will get tired of being used as a cheap marketing object; previously, animal rights and being eco-friendly served this purpose. Now, if you want some free publicity, just say you are gay-friendly. Put a rainbow logo on your website (costs 10 minutes of work), employ someone who is gay, and you are done: you are modern and progressive. It helps, even if you are a big corp that forces its workers to pee into bottles because you punish going to the toilet.
Next, I am wondering when LGBT people will get tired of all those who pretend to be LGBT just to earn a discount, a conference invitation, or better treatment at school or work. The old French movie about a guy pretending to be gay to avoid being fired will no longer be just a funny movie once 50% of people claim they are LGBT.
I would think that given Rust is not really a corporate-backed project or trying to upsell you on Rust Pro, Big Rust is not making anyone pee in bottles.
Are people even reading the article before they comment on Hacker News with a bunch of trite culture war outrage these days?
The LGBT part is barely mentioned (one sentence of a couple of paragraphs on the code of conduct), and the overall motif of this point (relatively minor in the article's context) is less "we are propping up LGBT coders" and more "we attract a more diverse community because we discourage toxic assholes".
This ignores that the “LGBTQ” community and politics around it and support/non-support for it can become toxic. See things like pronouns, renaming of male/female terminology, minor disputes that are blown out of proportion because someone wants attention.
So much of coding culture needs to chill out, and the tools should be left to diversify into different communities. If people want to fight about politics/gender/inclusion/etc., they should use the appropriate channels for that instead of putting everyone into a situation where you have to choose sides.
While I do think there is some degree of clout-seeking in how grievances often get aired, I don't think it's fair to tell people to "use the appropriate channels" -- that's just a way for issues to get ignored and for the status quo to get upheld, where in a lot of cases the status quo is not working for many people in the LGBTQ community. It's really only through this kind of very in-your-face approach that anything changes. Whether you would like those changes or not is up to you, but it's not fair to tell people to stick to neat little siloes so people can feel free to just ignore their issues.
> I don't think it's fair to tell people to "use the appropriate channels" -- that's just a way for issues to get ignored and for the status quo to get upheld
No, it's a way for programmers to keep doing programming with less distractions, of which we already have a metric ton.
Forgive me if I don't want politics in my work. If I want to do political activism then I know where to go. Unlike many others, you included, apparently.
People who hate those who are different are out there, that's known. But they will not be defeated by annoying literally everyone else.
I know. You'll say something along the lines of: not taking a side is defending bullies, or whatever. These arguments are extremely tiring and borderline fanatic, so I'm just passing by to give you the perspective of a fairly average guy, and peace out.
It seems people will never learn that shoving stuff in others' faces is not supporting their cause. Oh well. History will keep repeating itself then.
Rosa Parks really should have just sat down in the right spot on the bus instead of shoving her politics in people's faces. There's a time and place for activism but it's definitely not the bus – I have to use it to get to work!
Political activism is not effective unless it inconveniences people, because otherwise people can freely ignore it and sequester it as something they don't have to care about.
If you insist on people bringing politics into work, you will open the floodgates to so many problems. And just because some group of activists is supported by management doesn’t mean it will always be the case, and you’re just supporting workplace bullying then of those who do not wish to conform to the activist culture.
Business and professionalism concerns would make it expedient to avoid bringing political conflicts into the workplace.
There is plenty of time to organize, advocate, etc without inflaming the workplace and creating a toxic environment.
It's like going to a Wendy's and, instead of asking "how much for the burger", you start a conversation like "I am bald. Do you like bald people? Is this Wendy's aware of bald people's existence? Do you recognize us as people? Do you stand by our side or not?"
Then you review the Wendy's: "Did not speak my language. Not bald. Probably racist. Won't come again."
To me it seems LGBTQ people are drawn to interesting things that differ from the mainstream. A small, weird, underestimated project which becomes a mainstream success parallels their own identity struggle.
It's not that these projects become successful because of it, but that LGBTQ people enjoy helping to make them successful. E.g., "Everyone thought this software was weird and scary and looked down upon it, but we helped prove that it is a wonderful thing once you get to know it."
> but the fact that LGBTQ people enjoy helping to make it successful.
can we stop idolizing things that are so common, they're boring?
I've worked with LGBTQ people in software engineering before the LGBTQ acronym was even invented, and already in the 90s they were 10 to 20 years older than me...
Why are there people pretending the world did not exist, or had different rules, before Rust?
I think the argument is that OP wants the hammer factory to focus on making hammers. Reality is more subtle than that, but I have some sympathy for the argument.
When the Ukraine war broke out last year, several popular Rust packages were eager to proselytize about how much they support Ukraine. That turned me off. Apparently there is also a lot of LGBTQ activism in the official Rust Foundation too, but I don't know much about it.
As a gay man, I'm not especially concerned about the hammer factory being unwelcoming, but I'd still rather work for the one that has a non-discrimination policy and where I see employees are open about their sexuality.
Do I mention going on a trip with my husband, or my spouse? That's what I mean about being open about my sexuality. Most people don't give that sort of thing a second thought.
We've banned this account for breaking the site guidelines and ignoring our requests to stop.
If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.