C++ is on a trajectory toward a safer future. Whether we should do profiles or static lifetime checking (or something else??) is still an open question (and both may be valid). However, I'm glad C++ is thinking about it. We have real problems around safety in the real world, and people are writing unsafe code even when modern safe code would be easier to write.
Of course, it remains to be seen how this all plays out. Static lifetime checking can be done well or badly. Profiles can be done well or badly. And even if whatever we come up with is done well, that doesn't mean people will use it (I know Rust programmers who just put unsafe everywhere).
Profiles are vaporware. The C++ folks are pushing a fantasy of "full memory safety with no changes to existing code, not even annotations to enable sound static analysis." That's just a non-starter: there is no way to get to full memory safety from there, unless you count very silly things like making "delete" and "free()" no-ops - and running everything in a single thread for "concurrency safety".
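To make that concrete, here's a minimal sketch (the function is invented for illustration) of why code without annotations defeats sound lifetime analysis:

```cpp
#include <string>

// Given only this declaration, no sound analyzer can tell whether the
// returned pointer aliases the argument, points at a global, or must
// not outlive the call:
const char* find_token(const std::string& s);

void caller() {
    const char* t = find_token(std::string("a b c"));
    // Is t dangling here? From the signature alone the answer is
    // unknowable; either an annotation or a whole-program analysis
    // has to supply the missing lifetime contract.
    (void)t;
}
```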
The only way to get anywhere is to provide a path forward. I have a lot of C++98 code that has been working just fine for 14+ years (that is, since before C++11). It isn't worth changing unless we discover a bug in it (after 14+ years, unlikely) or we need to add new features (if we haven't in 14+ years, we probably won't need a new feature there anytime soon). Code I write today is the latest C++. What I really want is a way to say "don't write the bad things" today, but still allow that old code to work. That is what profiles promise me. Sure, we will never get to full memory safety that way, but that isn't my goal; I just want to make my new code better, and when I come back to old code, improve that too.
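A rough approximation of that workflow already exists with ordinary warning controls; a sketch, assuming new code builds with -Wall -Wextra -Werror (the header name is invented):

```cpp
// New translation units build with -Werror, so the "bad things" fail
// the build. A 14-year-old header can be exempted at the include site
// without touching it:
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wold-style-cast"
#include "legacy_widget.h"   // hypothetical pre-C++11 header
#pragma GCC diagnostic pop
```

Profiles, as pitched, would in effect standardize that kind of opt-in, per-module enforcement.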
The case for "100% Safe C++" is that you might be able to annotate that old C++98 code in ways that don't otherwise alter its semantics, but still ensure safety. That would be a one-time cost that might be well-worth paying if the cost is low enough - Where "cost" depends on developer experience as opposed to mere volume of annotations. A "viral" compiler feature that auto-surfaces all the places that will need annotation for a given level of safety has the potential to be quite easy to learn and use effectively. It's not clear why the C++ folks are rejecting that approach, seemingly out-of-hand.
I have more than 10 million lines of C++ that are not annotated, and there are many projects much larger than mine. If you cannot automatically annotate the code, there is no point in trying, as you can't do it manually. And if you can automate it, why not just build that into the compiler and skip the syntax?
> If you cannot automatically annotate the code there is no point in trying as you can't do it manually.
How can you know this without a "viral" analysis that tells you how much annotation is needed, and where? Perhaps the code factors out all the low-level, "memory unsafe" hacks to its own module, and that can be feasibly annotated. It's just not something we can know in advance.
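As a purely hypothetical illustration of that factoring: imagine every raw-pointer hack living behind one tiny interface, so an annotation pass would only ever touch this file:

```cpp
#include <cstddef>
#include <vector>

// arena.h (hypothetical): the project's single "memory unsafe" module.
// Only this class hands out raw pointers; everything above it uses
// std::vector, std::string, etc., and could pass a safety analysis
// unmodified.
class Arena {
    std::vector<std::byte> buf_;
    std::size_t used_ = 0;
public:
    explicit Arena(std::size_t capacity) : buf_(capacity) {}

    // The one entry point that would actually need lifetime annotations.
    void* allocate(std::size_t n) {
        void* p = buf_.data() + used_;
        used_ += n;
        return p;
    }
};
```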
> Perhaps the code factors out all the low-level, "memory unsafe" hacks to its own module, and that can be feasibly annotated.
While it is theoretically not impossible for that scenario to occur, I'd say it sounds wildly unlikely for anything that can be described as 'old' code.
I suspect the best-case scenario is a "stone soup": https://en.wikipedia.org/wiki/Stone_Soup
The fantasy is enough to get engagement and once you have engagement you can persuade people to do a "little" extra work to get the full benefits. My mother won't buy the product for $5, but if you tell her that it costs $10 but they're 2-for-1 today, she's going to buy that and feel like she got a bargain.
In terms of actually solving the problem well, it's not even captured in these hypothetical regulatory requirements. What you actually want is a safety culture; Rust has one, C++ does not, and no technology will change that. From what I can tell, nobody at WG21 wants that to change anyway.
> What you actually want is a safety culture, Rust has one
Rust has a safety culture because Safe Rust imposes requirements that preserve safety while also playing well with modularity and iterative development. If "Safe C++" can enforce similar requirements, we can expect that a safety culture can be sustained there as well.
The technology does not gift you the associated culture, and that's worth knowing even far outside this business, because it applies everywhere.
Yes, a technology can be enabling, but it isn't enough to inculcate the desired culture; that has to come from somewhere else. You can't "sustain" something which does not exist.
Actually WG21 ("The C++ Language Committee") illustrates this well in another way. When WG21 was created it was after the Mother Of All Demos, and so after video conferencing exists as an idea, but to be fair to them it was not really practical at the scale needed for WG21 processes at that time. When C++ 98 shipped it was just about practical, although most ordinary people would have needed to travel to some place with appropriate equipment. By this point the IETF is routinely but not yet universally using such technology.
By the time C++ 11 shipped, I have an ordinary job where I worked full time from home, travelling to a physical location only once or twice per month because video conferencing is now such a mudane and ordinary capability as to go unremarked.
Only since the COVID-19 pandemic has WG21 finally offered the option of attending without flying around the world several times per year. The technology to do this had existed for decades, but the culture did not exist.
If you have access to the WG21 meeting minutes, it appears the safety discussions of the last meeting were quite entertaining.
Look, we need more than just promises. C++ is charting a path to the future in the most torturously slow process possible, primarily because of an absolutely intransigent performance obsession that won't even admit the possibility of a 1% performance overhead for bounds checks. The C++ steering committee are the real extremists, holding back the entire software industry because of a sacred cow and a free pass to externalize its cost onto the rest of us in the form of significantly less secure software.
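For context, the check in question is already available as an opt-in today, e.g. libstdc++'s lightweight hardening:

```cpp
#include <vector>

int main() {
    std::vector<int> v(10);
    // Unchecked by default. Compiling with
    //   g++ -D_GLIBCXX_ASSERTIONS prog.cpp
    // makes operator[] verify the index and abort cleanly instead of
    // silently corrupting memory - the class of check whose cost the
    // comment above puts at around 1%.
    return v[12];   // out of bounds: UB normally, a diagnosed abort when hardened
}
```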
> The C++ steering committee are the real extremists, holding back the entire software industry because of a sacred cow and a free pass to externalize its cost onto the rest of us in the form of significantly less secure software.
The C++ leadership serves the C++ community, not the entire software industry. You and everyone who disagrees with them are free to use and write software based on other languages, e.g. Java and Rust.
Many in the C++ community wouldn't acknowledge that.
Which is why disabling RTTI, disabling exceptions, writing their own standard-library replacements, and using static analysers to forbid specific language constructs are such a big deal in some C++ circles.
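For readers outside those circles: the first two are single compiler flags (-fno-exceptions and -fno-rtti in GCC and Clang), and the corresponding language features then fail to compile at all:

```cpp
#include <typeinfo>

struct Base { virtual ~Base() = default; };

void f(Base& b) {
    // Under -fno-exceptions this line is a compile error
    // ("exception handling disabled"):
    throw 42;

    // Under -fno-rtti, typeid on a polymorphic type is likewise
    // rejected at compile time:
    const std::type_info& t = typeid(b);
    (void)t;
}
```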
You can even add nonstandard features to existing compilers!
The neat thing is that once the standards committee learns about this use case, it could get de facto support as existing practice!