WebKit, as I understand it, is not really a typical C++ codebase that you just build with a popular compiler; it's a codebase that follows its own, significantly stricter standards and has a lot of additional tooling to avoid bugs.
And I'd say that even with all that additional effort, it has a level of bugs that is not "fine". Indeed, per the article, I suspect that the maintainers of WebKit are some of the people pushing to make C++ more Rust-like.
WebKit TBH wasn't a great example, since it's arguably a piece of software that would benefit from being developed in Rust. That said, the point is that we don't need "one language to rule them all": C++ has made certain tradeoffs that will not be ideal in all circumstances or for all projects. Trying to change those tradeoffs because a handful of projects (like WebKit) would be better served by different ones is not necessarily the right choice for the language itself, or for its community of users. Things are not so simple as "there are two factions of C++: those who agree with me and are right, and those who disagree and are wrong".
I think silent memory corruption is almost never a good tradeoff. (The one possible exception is something like a single-player video game, where unknown corruption might be less bad than crashing - but even then, avoiding the situation in the first place is better.) An argument used to be made (if not in so many words) that accepting a certain amount of occasional memory corruption was a necessary tradeoff for performance; it's an argument I was always dubious about, and Rust has now proven it completely false.
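To make that concrete, here's a minimal Rust sketch (not from the original discussion; the variable names are illustrative) of the kind of bug the borrow checker rejects at compile time, where the equivalent C++ compiles and can silently corrupt memory. Because the check happens entirely at compile time, there is no runtime cost to pay for it:

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];           // shared borrow of v's heap buffer
        // v.push(4);                // uncommenting this is a compile error:
        //                           // a push may reallocate the buffer and
        //                           // leave `first` dangling; the C++
        //                           // equivalent (holding a pointer or
        //                           // iterator across push_back) compiles
        //                           // and can silently corrupt memory
        println!("first = {first}"); // the borrow ends here
        v.push(4);                   // fine now that the borrow has ended
        println!("len = {}", v.len());
    }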
Fundamentally I don't think this is a case where C++ makes a deliberate design tradeoff that makes sense for some projects. I think it's just a bad design choice (not even a choice as such - it wasn't a question that was considered at all when C++ was first designed) that should be corrected. Sometimes there is a right answer.
> Sometimes there is a right answer.
Indeed. And when that "right answer" comes along, it tends to sweep away everything else. If it's universally better, why wouldn't it?
Except that Rust does not do that, which is a hint that it's not a "universally right answer" but the right answer for a subdomain of problems. That's basically what I was trying to say: that it does come with its own tradeoffs/downsides.
(Maybe I'm wrong and it's only a matter of time until that happens, but I don't think so. It's been a while now, and there has been time for it to make that impact. Lifetime annotations are not yet adopted by any other mainstream language, AFAIK.)
> Indeed. And when that "right answer" comes along, it tends to sweep away everything else. If it's universally better, why wouldn't it?
> Except that Rust does not do that, which is a hint that it's not a "universally right answer" but the right answer for a subdomain of problems. That's basically what I was trying to say: that it does come with its own tradeoffs/downsides.
Rust may not be the only right answer, but memory unsafety is the wrong one. New projects overwhelmingly pick memory-safe languages, and governments and organisations are moving to restrict memory-unsafe languages, at least for new projects. I don't think anyone is picking C++ at this point unless they already have a big sunk cost invested in it (even if that cost is just their personal programming experience).
> Lifetime annotations are not yet adopted by any other mainstream language, AFAIK
Linear Haskell is getting there, but most languages aren't flexible enough to retrofit lifetimes (or at best it would be a multi-year effort, like adding types to Python) - as we're seeing in this whole C++ discussion. Also non-GC languages are niche in the first place, and the problem lifetimes solve is a lot less urgent in a GC language. I don't think any post-Rust language has hit "mainstream" yet (we only really get a couple of new mainstream languages a decade), so we'll see what happens in the future.
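For anyone following along who hasn't written Rust, here's a minimal sketch of what "lifetime annotations" actually look like (this is the textbook example, not anything from this thread): the `'a` ties the returned reference to the lifetimes of both inputs, so the compiler can prove it never outlives the data it points into.

    // `'a` says the returned &str borrows from x and/or y, and therefore
    // cannot outlive either of them; the proof is entirely compile-time.
    fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
        if x.len() > y.len() { x } else { y }
    }

    fn main() {
        let a = String::from("borrow");
        let b = String::from("checker");
        println!("{}", longest(&a, &b)); // prints "checker"
    }

Threading that kind of annotation through an existing type system is exactly the retrofit problem mentioned above, and in a GC language the payoff is much smaller, which is presumably why no mainstream GC language has taken it on.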