While Rust probably wouldn't have caught this particular bug, it is worth noting that the class of memory bugs C and C++ have is especially pernicious. Bugs caused by manual memory allocation in these languages are often highly exploitable, and they come from a common and necessary part of the language. As a result, you end up with somewhere around 70% of security vulnerabilities being caused by a feature that just doesn't exist in other languages. The reason Rust gets so much hype is that it removes the last excuse for sticking with C/C++ (GC pause times).
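To make the point concrete, here is a minimal made-up sketch of the kind of bug in that class; the function and names are invented, but the use-after-free pattern is the textbook case:

```c
#include <stdlib.h>
#include <string.h>

/* Contrived sketch of a classic use-after-free: the buffer is freed on one
 * path, but a stale pointer to it is still used afterwards. Nothing in the
 * language stops this, and an attacker who can influence heap layout can
 * often turn it into code execution. */
int handle_request(const char *input) {
    char *buf = malloc(64);
    if (buf == NULL) return -1;

    strncpy(buf, input, 63);
    buf[63] = '\0';

    if (buf[0] == 'Q') {          /* early-exit path frees the buffer... */
        free(buf);
    }

    /* ...but this path still reads it, even when it was just freed */
    int len = (int)strlen(buf);   /* use-after-free on the 'Q' path */

    free(buf);                    /* and a double free on that path too */
    return len;
}
```

Nothing in the compiler flags either path; in a borrow-checked or garbage-collected language this shape of mistake either doesn't compile or can't arise at all.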
But do your users prefer C? For software engineering to be a professional discipline, we really need to make choices based on the product outcome rather than on what feels fun to program in.
I imagine they do. If they didn't, they wouldn't be using it. They would instead be using some equivalent piece of software written in a trendy language.
The large majority of users cannot make well-informed security risk decisions like this. Engineers should do the right thing and help these users. In the same way, I can't make a meaningful risk assessment for using a bridge or riding an elevator. Civil Engineers don't just get to say "well, if users are concerned about the safety of my bridge then they can make a different choice, so I'm going with the stuff I personally like working with." Why do Software Engineers get away with this?
Outside of small hobbyist projects, the industry has an obligation to provide users with safe software. At this point, that means it is no longer acceptable to:
1. Start new projects in C or C++ when they are intended for production and face nontrivial security threats
2. Not have a plan to categorically prevent memory safety errors in legacy codebases over the next decade or so, whether that be by transitioning to new languages or by applying rigorous hardware-level memory tracking
Yeah, can you imagine if it actually mattered that companies make choices they know will inevitably lead to zero-click exploits on internet-enabled devices? Somebody sitting down to write a media decoder in C today knows that this means a steady stream of exploits harming their customers.
I don't think your argument is all that strong. Memory bugs are not equivalent to vulnerabilities. To get to a security vulnerability you first need a logic bug that makes incorrect assumptions about the program state, the hardware, the API, the input, or something else.
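To make up an example of what I mean (names invented): the actual mistake below is the logical assumption that the length field inside the message is honest; the out-of-bounds write is only how that wrong assumption surfaces in C.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical message format: the length field comes straight from
 * untrusted input. The logic bug is trusting it. */
typedef struct {
    uint16_t claimed_len;     /* attacker-controlled length field */
    const uint8_t *payload;
} message_t;

uint8_t *copy_payload(const message_t *msg) {
    uint8_t *out = malloc(256);   /* assumes messages never exceed 256 bytes */
    if (out == NULL) return NULL;

    /* No check that claimed_len <= 256: the faulty assumption about the
     * input is what turns this into an out-of-bounds heap write. */
    memcpy(out, msg->payload, msg->claimed_len);
    return out;
}
```

The same wrong assumption in a bounds-checked language is still a bug; it just fails differently.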
Also, what is so bad about crashing bugs? To me, as a programmer, they're very good news. It means you found a bug, you (hopefully) have a crash/log to analyze, and more importantly it means the program didn't just silently continue executing and corrupt state/data without anyone knowing.
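For contrast, this contrived example is the silent case I'm talking about: it's undefined behaviour, but built without hardening it typically runs straight through with no crash and nothing in the logs.

```c
#include <stdio.h>
#include <string.h>

struct session {
    char name[8];
    int  is_admin;            /* sits right after the undersized buffer */
};

int main(void) {
    struct session s;
    s.is_admin = 0;

    /* "ABCDEFGHIJK" is 11 characters plus the terminator: 12 bytes written
     * into an 8-byte field. Undefined behaviour, but on a typical build the
     * extra bytes simply land in is_admin. */
    strcpy(s.name, "ABCDEFGHIJK");

    /* No crash, no log entry, yet the program now runs with corrupted state. */
    printf("is_admin = %d\n", s.is_admin);
    return 0;
}
```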
Memory bugs are often vulnerabilities. Efforts to classify them as “very unlikely to be exploited” almost always end up turning into “someone with sufficient interest can exploit this”.
Most languages other than C/C++ (e.g. Rust) do bounds checks by default. Also, C is pretty much the only language that uses null-terminated strings, which are a very common source of overflows.
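A small made-up illustration of why that combination bites: with null-terminated strings the length is implicit, so nothing forces the caller to check it against the destination. The second function just shows the usual explicit-size workaround in C; a language whose strings carry their length gives you that check by default.

```c
#include <stdio.h>
#include <string.h>

/* Nothing here knows how big dst is: strcpy keeps writing until it finds
 * the terminator in src, overflowing dst if src is longer. */
void set_username_unsafe(char *dst, const char *src) {
    strcpy(dst, src);
}

/* The usual C mitigation: pass the size explicitly and truncate. */
void set_username_checked(char dst[16], const char *src) {
    snprintf(dst, 16, "%s", src);   /* never writes past the 16-byte buffer */
}

int main(void) {
    char name[16];
    set_username_checked(name, "a-rather-long-user-supplied-name");
    printf("%s\n", name);           /* truncated to 15 chars + terminator */
    return 0;
}
```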