ChrisRR 6 days ago

It depends on what you mean by simple. C itself is still simple, but it doesn't include a lot of the features other languages do, and implementing them yourself in C is not simple.

C is simple for some use cases, and not for others.

acuozzo 6 days ago

> C still is simple

Syntactically, yes. Semantically, no.

There are languages with tons of "features" with far, far less semantic overhead than C.

https://blog.regehr.org/archives/767

FWIW, writing programs in C has been my day job for a long time.
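
To give a concrete taste of that overhead (my own toy example, not one from the link): signed integer overflow is undefined behavior, so a compiler is allowed to assume it never happens and quietly fold away a check that looks perfectly sensible:

    /* Looks like a sane overflow check, but x + 1 is undefined
       behavior when x == INT_MAX, so an optimizer may legally
       assume it never overflows and fold this function to 0. */
    int will_overflow(int x) {
        return x + 1 < x;
    }

Mainstream optimizers are known to do exactly this at -O2, which is the kind of semantic overhead I mean.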

drob518 6 days ago

Exactly. There is a lot happening implicitly in a C program that the programmer has to be aware of and keep in mind. And it's made worse by perfectly valid compiler implementation choices. I remember chasing a bug for a day that came down to me forgetting that the particular implementation I was working with had signed characters and was sign-extending something at an inopportune time.
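
Reconstructed from memory, the trap was roughly this shape (a sketch, not the actual code):

    #include <stdio.h>

    int main(void) {
        char c = 0xFF;     /* whether plain char is signed is implementation-defined */
        int  i = c;        /* sign-extends to -1 on signed-char targets, 255 elsewhere */
        printf("%d\n", i); /* prints -1 or 255 depending on the implementation */
        return 0;
    }

Nothing in the source tells you which one you get; you have to know your implementation.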

kstrauser 6 days ago

C is simple like SMTP.

acuozzo 6 days ago

EXACTLY!

groos 6 days ago

As someone who has had to parse C syntax for a living, I'd argue that it's not syntactically simple either. (Declarators are particularly nasty in C and even more so in C++).

ChrisRR 5 days ago

Entirely my point. Simpler in some ways, more difficult in others. It totally depends on the use case.

90s_dev 6 days ago

The appeal of C is that you're just operating on raw memory, with some slight conveniences like structs and arrays. That's the beauty of its simplicity. That's why casting a struct to its first member works, why everything has an address, or why pointer arithmetic is so natural. Higher-level langs like C++ and Go try to retain the usefulness of these features while abstracting away the actuality of them, which is simultaneously sad and helpful.
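
For example, the struct cast works because of a real guarantee: a pointer to a struct, suitably converted, points to its first member. A minimal sketch (hypothetical names):

    #include <stdio.h>

    struct packet { int tag; double payload; };

    int main(void) {
        struct packet p = { 42, 3.14 };
        /* The standard guarantees no padding before the first member,
           so a suitably converted struct pointer points right at it. */
        int *tag = (int *)&p;
        printf("%d\n", *tag); /* prints 42 */
        return 0;
    }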

zokier 5 days ago

> The appeal of C is that you're just operating on raw memory ... why everything has an address, or why pointer arithmetic is so natural

That is just an illusion, one that trips up unsuspecting programmers into false mental models. Pointers are not addresses, and pointer arithmetic is rife with pitfalls. There is the whole pointer provenance thing, but that's just the tip of the iceberg.

That is really the problem with C: it feels like you can do all sorts of stuff, but in reality you are just invoking nasal demons. The real rules for what you can and cannot do are far more intricate and arcane, and nothing about them is obvious at the surface level.
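
The classic provenance demo makes the point (a sketch, assuming b happens to be laid out right after a in memory):

    #include <stdio.h>

    int main(void) {
        int a = 1, b = 2;
        int *p = &a + 1;  /* one-past-end: legal to form, not to dereference */
        int *q = &b;
        if (p == q) {     /* can compare equal if b sits right after a */
            *p = 3;       /* UB: p's provenance is still a, whatever its address */
            printf("%d\n", b); /* optimizing compilers have printed 2 here */
        }
        return 0;
    }

Same address, different provenance, and the "raw memory" mental model gives you no warning at all.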

windward 6 days ago

A C program of useful length typically includes a smattering of implicit type conversions that the programmer never intended or considered. That's the consequence of a feature that abstracts away how the type system and memory really[1] act.

[1]for certain definitions of 'really'
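
A hypothetical example of the kind of conversion that sneaks in: compare a signed index against an unsigned length, and the usual arithmetic conversions quietly rewrite the test:

    #include <stdio.h>

    int main(void) {
        unsigned int len = 10;
        int i = -1;
        /* The usual arithmetic conversions turn i into a huge
           unsigned value, so the branch is not taken even though
           -1 < 10 mathematically. */
        if (i < len)
            printf("in range\n");
        else
            printf("out of range\n");
        return 0;
    }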

colejohnson66 6 days ago

> That's why casting a struct to its first member works

Until WG14 makes everything you love about C "undefined behavior" in the name of performance.

worik 6 days ago

> Until WG14 makes everything you love about C "undefined behavior" in the name of performance.

What do you mean?

I just looked up WG14 and I cannot see what you mean.

A link perhaps? Am I going to have to "pin" my C compiler version?

tialaramex 6 days ago

Some people have this idea that when they write utter nonsense it should do what they meant, i.e. they're skipping the whole discipline of programming and going straight from "I want it to work" to "It should work", and they don't understand what they're doing wrong.

For some of these people WG14 (the C language sub-committee of SC22, the programming language sub-committee of JTC1, the Joint Technical Committee between ISO and the IEC) is the problem: somehow they've taken this wonderful language where you just write stuff and it definitely works and does what you meant, and turned it into something awful.

This doesn't make a whole lot of sense, but hey, they wrote nonsense and they're angry that it didn't work, do we expect high quality arguments from people who mumble nonsense and make wild gestures on the street because they've imagined they are wizards? We do not.

There are others who blame the compiler vendors, and this at least makes a little more sense: the people who write Clang are literally responsible for how your nonsense C is translated into machine code which does... something. They probably couldn't have read your mind and ensured the machine code did what you wanted, especially because your nonsense doesn't mean that, but you can make an argument that they might do a better job of communicating the problem (C is pretty hostile to this, and C programmers no less so).

For a long time I thought the best idea was to give these people what they ostensibly "want": a language where everything does something very specific. As a result it's slow and clunky, and maybe, after spending so much effort to produce a bigger, slower version of the software a friend wrote so easily in Python, these C programmers will snap out of it.

But then I read some essays by C programmers who had genuinely set out on this path and realised to their horror that their fellow C programmers don't actually agree on what their C programs mean. The ambiguity isn't some conspiracy by WG14 or the compiler vendors; it's their reality: they are bad at writing software. The whole point of software is that we need to explain exactly what the machine is supposed to do, and when we write ambiguous programs we are doing a bad job of that.

rini17 4 days ago

The premise "lol who needs memory safety at runtime, you get SIGSEGV if there's a problem, no biggie, let's make it FAST and don't bother with checks" was the original horror. There are enough cowboys around who loved the approach. It's actually not so surprising that such a mindset became cancerous over time. The need to extract maximum speed devoured the language semantics too. And it's spreading: WebAssembly mostly inherited it.

SAI_Peregrinus 6 days ago

I've said before that C is small, but not simple.

Turing tarpits like Brainfuck or the Binary Lambda Calculus are a more extreme demonstration of the distinction: they can be very tiny languages but are extremely difficult to actually use for anything non-trivial.

I think difficulty follows a "bathtub" curve when plotted against language size. The smallest languages are really hard to use; as more features get added, a language gets easier to use, up to the point where it becomes difficult to keep track of all the things the language does, and it starts getting more difficult again.