> We see no realistic path for an evolution of C++ into a language with rigorous memory safety guarantees that include temporal safety.
The point Herb was making is that "rigorous memory safety" isn't the only bar, nor should it be. Saying there is no way to make C++ have rigorous memory safety is not the same as saying C++ can never be made safe.
> I have to say, this sentence annoys the heck out of me.
> Old code that can't be understood needs to be rewritten anyway.
I don't think it's that old code can't be understood; you can always understand what code is doing mechanically.
It's a question of whether you can predict the consequences. For example, if I rename this database column, what in our systems that have been built over the last 30+ years will explode? That's data rather than code, but the underlying idea is the same.
The problem is that the very act of rewriting it puts you at risk of adverse effects.
What's the overall steelman here? Something like, "Rust isn't a silver bullet. Let's make a subset of a superset of C++ that's safe." Okay, cool. Sounds great, and the attempt at making C++ safe without breaking existing users certainly seems like an uncontroversially good thing to try. I don't really have any response to that.
I don't need a "lens of negativity" to criticize the details of Sutter's argument. This part of the article in particular would be a lot weaker overall if Sutter presented the reality rather than some sloppy non sequitur. And that actually matters for his argument, because it cuts to the heart of just how big a trade-off Rust really is. If it isn't as big as he seems to be suggesting, then Rust's value proposition gets stronger. It's a critical flaw in his reasoning.
Steelmanning is great and we should try to do that. But I don't actually see a stronger version of Sutter's argument here. It's not just a phrasing issue, although the phrasing is certainly what jumped out to me. And I could just as easily say that the problem with Sutter's article is that he isn't doing a very good job of steelmanning the case for Rust. Whoop de do.
> Of the remaining crates which use "unsafe", the unsafe code is often contained to a tiny percentage of the code, so if we're looking at the overall amount of unsafe code, you're going from 100%, to a fraction of a percent.
I dislike this argument because Rust `unsafe` code is typically placed in a module, with safe code around it protecting it.
Guess how good C++ code is written?
Exactly. The `unsafe` keyword certainly helps, but it is not a panacea nor a guarantee, given that a bug in the safe code that's protecting the unsafe code could be the root cause of security issues, even if they manifest in the unsafe code.
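A contrived Rust sketch of that failure mode (function names are hypothetical): the off-by-one risk lives entirely in safe code, but if the bug were introduced, the undefined behavior would surface inside the `unsafe` block that trusted it.

```rust
// Safe helper: computes the index of the last element. An off-by-one here
// (e.g. returning `len` instead of `len - 1`) would be a bug in purely
// safe code, yet the fallout would surface in the unsafe block below.
fn last_index(len: usize) -> usize {
    len - 1
}

// Safe API wrapping an unsafe, unchecked read.
pub fn last(v: &[u32]) -> Option<u32> {
    if v.is_empty() {
        return None;
    }
    let i = last_index(v.len());
    // SAFETY: relies on `last_index` returning an in-bounds index -- a
    // guarantee upheld only by the safe code above, not by the compiler.
    Some(unsafe { *v.get_unchecked(i) })
}

fn main() {
    assert_eq!(last(&[1, 2, 3]), Some(3));
    assert_eq!(last(&[]), None);
}
```

The upside, as the comments below note, is that the audit surface for such a bug is the module around the `unsafe` block, not the whole program.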
C++ can't generally encapsulate safety. Rust can generally encapsulate safety. That's the essential difference. It's true that the boundaries of safety in Rust extend to the scope of the module in which `unsafe` is used, but C++ has no boundaries at all.
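As a minimal illustration of that boundary (module and function names are hypothetical), here is an `unsafe` call fully enclosed behind a checked safe API; outside the module, there is no way to reach the `unsafe` block without going through the check:

```rust
mod ascii {
    /// Converts ASCII bytes to a `String`, skipping UTF-8 re-validation.
    /// The check below is the safety boundary: callers outside this module
    /// cannot reach the `unsafe` block without passing it.
    pub fn to_string(bytes: Vec<u8>) -> Option<String> {
        if bytes.iter().all(|b| b.is_ascii()) {
            // SAFETY: every all-ASCII byte sequence is valid UTF-8.
            Some(unsafe { String::from_utf8_unchecked(bytes) })
        } else {
            None
        }
    }
}

fn main() {
    assert_eq!(ascii::to_string(b"hello".to_vec()).as_deref(), Some("hello"));
    assert_eq!(ascii::to_string(vec![0xFF]), None);
}
```

An equivalent C++ class can document the same invariant, but nothing in the language marks the trusted region or stops callers from bypassing it.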
> but is not a panacea
Do you have a source to literally anyone credible saying Rust is a panacea?
> nor a guarantee given that a bug in the safe code that's protecting the unsafe code could be the root cause for security issues, even if it manifests itself in the unsafe code.
This just in. Rust doesn't guarantee bug-free code. Holy shit. What a revelation! Like, really? That's your problem with the argument? That it doesn't live up to the standard that bugs can't exist?
The value proposition of Rust has been, is, and always will be that it can encapsulate a core of `unsafe` usages in a safe API with no or very little overhead. The promise of Rust is that this power of encapsulation will lead to less undefined behavior overall. Not that it literally makes UB impossible because, well, yes, you can have bugs in `unsafe`!
To head off the pedants: yes, not everything can be safely encapsulated in Rust. File-backed memory maps are my favorite example. And yes, not just bugs in `unsafe` blocks but bugs in the language implementation itself can lead to UB.
And yes, Rust achieves this through a trade-off, as Sutter mentioned. None of this should be controversial. But what Sutter misses, IMO, is a fair analysis of that trade-off. He does a poor job of representing its essential characteristics.
How would you define concurrent access to a filesystem?
That's a serious question: if I open a file for reading and another process writes to it, exactly how is the C++ standard supposed to protect against that?
If a company won't update because it really needs to depend on whether one bool compares greater than another, then by all means it can stick to using a 20-year-old version of C++.
Other companies with modern engineering disciplines that don't write hacky code like that can benefit from a sane and sensible compiler instead of being dragged down.
Too late for that, I am in a decision making position at a quant firm with very strict engineering standards and I absolutely stand by my decision that businesses that write code that compares booleans together like that should not be in a position to hold back other businesses that don't.
They can continue using 20-year-old compilers and quit making the language worse for the rest of us who have put in the effort and cost of writing modern software.
I agree. Asking everyone else to pay the cost of writing error-prone code because they refuse to adapt, while still feeling entitled to use new compilers, is a big red flag and poor technical decision making that offloads the cost onto the rest of the community.
I'm glad we managed to get that out of the way.
Companies that wish to stick with their existing and deprecated coding standards can stick to their existing and deprecated compilers, allowing those of us who wish to have safe and modern tools the freedom to make progress without their baggage holding us back.
oh snap guys, do you see what he did there in his parley? The way he took my point and pretended I was saying something else and that I really agreed with him. That technique so got me that he won!
This is most definitely the paragon that should be helping us decide which large swathe of people to fuck over.
I don't disagree with this, but I'm struggling to understand how aiming for zero CVEs would somehow be too onerous a tradeoff when six is reasonable. Assuming that nobody wants to have any CVEs in their codebase, the idea that ending up with six is reasonable but aiming for zero is preposterous sounds like another way of saying "it's easy to accidentally miss six future CVEs in your codebase". If that's the case, how can you have any degree of confidence that by aiming for six, you won't end up with 12 instead?
There's a reason people say things like "actions speak louder than words".
It's easy to say "safety is about tradeoffs", but when you follow it up with an insistence that no tradeoffs should be made, it kind of makes it seem like you're just saying that to appear reasonable rather than actually being reasonable.
> Not to mention the weird conclusion that since no language has 0, that isn't the goal. I'm not sure I understand the logic that you shouldn't at least _try_ to not have any major security flaws.
He addressed that: the cost of getting to 0 would be too great (C++ would have to break backwards compatibility), so we should try to be in line with other languages instead.
I don't understand why you're acting as if he didn't make the point he made.
> He addressed that: the cost of getting to 0 would be too great (C++ would have to break backwards compatibility), so we should try to be in line with other languages instead.
> I don't understand why you're acting as if he didn't make the point he made.
My confusion is that I'd expect breaking backwards compatibility to either be completely off the table or for the amount of breakage allowed to be up for debate. If you're not willing to break compatibility at all, I feel like the goal should be to shoot as low as possible without breaking anything; if it's possible to get as low as other languages, why stop there? If you're willing to sacrifice some backwards compatibility, why not be willing to break it a little more to eliminate the last few sources of unsafety?
It's not clear to me that you read or understood the article; all of your posts certainly read as if you didn't.
He explained why 0 isn't the goal, you continue to act as if he didn't. I don't know where else this conversation can go without you going back and better understanding his actual point.
If the discussion requires that I find his explanation convincing rather than being able to think that it's not sufficient, then yeah, I guess there's nowhere else for it to go.
Which makes sense: if someone made an impression on you, that impression doesn't disappear in just a few days. At best it may be fuzzier, which could be good or bad.