Well, let's put it like this:
- Webkit, GCC, and a few others are non-trivial C++ codebases that are (I argue) useful.
- In your experience, since they are non-trivial, they have silent memory corruption bugs (i.e. they are not "perfectly safe").
Does this answer the "why bother with software at all" question?
Webkit, as I understand it, is not really just a C++ codebase built with a popular compiler; it's a codebase that follows its own significantly stricter standards and has a lot of additional tooling to avoid bugs.
And I'd say that even with all that additional effort, it has a level of bugs that's not "fine". Indeed, per the article, I suspect that the maintainers of Webkit are some of the people pushing to make C++ more Rust-like.
Webkit TBH wasn't a great example, since it's arguably a piece of software that would benefit from being developed in Rust. That said, the point is that we don't need "one language to rule them all" - C++ has made tradeoffs that will not be ideal in all circumstances or for all projects. Trying to change those tradeoffs because a handful of projects (like Webkit) would be better served by a different set of tradeoffs is not necessarily the right choice for the language itself, or for its community of users. Things are not as simple as "there are 2 factions of C++, those that agree with me and are right, and those that disagree and are wrong".
I think silent memory corruption is almost never a good tradeoff. (The one possible exception is something like a single-player videogame, where unknown corruption might be less bad than crashing - but even then, avoiding having the situation come up in the first place is better). An argument used to be made (if not in so many words) that accepting a certain amount of occasional memory corruption was a necessary tradeoff for performance; it's an argument that I was always dubious about, and now Rust has proven it completely false.
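For concreteness, a minimal sketch (my own example, not from the article) of the class of bug being discussed and how Rust refuses it at compile time; the commented-out line is the pattern that, in C++, typically compiles and fails silently at runtime:

```rust
// Sketch: a reference that would outlive the data it points to.
// In C++, freeing or moving the string while a pointer into it is still live
// compiles fine and corrupts memory silently; rustc rejects the equivalent.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let owned = String::from("hello world");
    let word = first_word(&owned);
    // drop(owned); // uncommenting this is a compile error:
    //              // "cannot move out of `owned` because it is borrowed"
    println!("{word}");
}
```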
Fundamentally I don't think this is a case where C++ makes a deliberate design tradeoff that makes sense for some projects. I think it's just a bad design choice (not even a choice as such - it wasn't a question that was considered at all when C++ was first designed) that should be corrected. Sometimes there is a right answer.
> Sometimes there is a right answer.
Indeed. And when that "right answer" comes along, it tends to sweep away everything else. If it's universally better, why wouldn't it?
Except that Rust does not do that, which is a hint that it's not a "universally right answer" but a right answer for a subdomain of problems. That's basically what I was trying to say: that it does come with its own tradeoffs/downsides.
(Maybe I'm wrong and it's only a matter of time until that happens; but I don't think so... it's been a while, and there has been time for it to make that impact. Lifetime annotations are not yet adopted by any other mainstream language, AFAIK.)
> Indeed. And when that "right answer" comes along, it tends to sweep away everything else. If it's universally better, why wouldn't it?
> Except that Rust does not do that, which is a hint that it's not a "universally right answer" but a right answer for a subdomain of problems. That's basically what I was trying to say: that it does come with its own tradeoffs/downsides.
Rust may not be the only right answer, but memory unsafety is the wrong one. New projects overwhelmingly pick memory-safe languages, and governments and organisations are banning memory-unsafe languages at least for new projects. I don't think anyone is picking C++ at this point if they don't already have a big sunk cost invested in it (even if that cost is just their personal programming experience).
> Lifetime annotations are not yet adopted by any other mainstream language, AFAIK
Linear Haskell is getting there, but most languages aren't flexible enough to retrofit lifetimes (or at best it would be a multi-year effort, like adding types to Python) - as we're seeing in this whole C++ discussion. Also non-GC languages are niche in the first place, and the problem lifetimes solve is a lot less urgent in a GC language. I don't think any post-Rust language has hit "mainstream" yet (we only really get a couple of new mainstream languages a decade), so we'll see what happens in the future.
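For anyone following along, a minimal sketch (my example, nothing more) of what "lifetime annotations" refers to here - the `'a` ties the returned reference to the inputs, so the compiler can check at every call site that the result isn't used after the data it borrows from is gone:

```rust
// The explicit lifetime 'a says: the returned &str borrows from x and y,
// so it must not outlive either of them.
fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
    if x.len() >= y.len() { x } else { y }
}

fn main() {
    let a = String::from("a fairly long string");
    let b = String::from("short");
    println!("{}", longest(&a, &b)); // fine: a and b are both still alive here
}
```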
Most C++ developers care greatly about the quality of their code, and suggesting that because the code isn't in a life-threatening context like a flight controller or a medical device it can be buggy with no repercussions is pretty silly.
Your examples of GCC and Webkit are both projects that have spent enormous amounts of effort to be as memory-safe as they can be, and both have had many memory-safety-related CVEs in the past. As was already pointed out, you still have to pay the cost of engineering memory-safe code, even when your compiler/static analysis doesn't have your back.
I was not saying anywhere that people don't or shouldn't care about the quality of their code. I was just pointing out that, whether we like it or not, "quality" is just one of the factors that goes into the mix of things to optimize for. Other factors like "time" and "effort" and "efficiency" and "compatibility" and even trivial stuff like "familiarity" play a role - or else you'd have formal proofs written in TLA+ or Alloy or the like before writing any system, and you'd have people immediately switch to safer languages like Rust (which is obviously not happening at scale).
The GCC/Webkit examples were not the best ones, but they were readily available examples that made one particular point: OP's comment was self-contradictory.
> Most C++ developers care greatly about the quality of their code
Not at our org. Though I know a couple of die-hard fans who will eat you for lunch if you do something stupid or ugly.