rqtwteye 10 days ago

You have to go lower down the stack. Don't use the AI; write the AI. For the foreseeable future there is a lot of opportunity to make the AI faster.

I am sure assembly programmers were horrified at the code the first C compilers produced. And I personally am horrified by the inefficiency of Python compared to the C++ code I used to write. We have always traded faster development for inefficiency.

EVa5I7bHFq9mnYK 10 days ago

C was specifically designed to map 1:1 onto PDP-11 assembly. For example, the '++' operator was created solely to represent auto-increment instructions like TST (R0)+.
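A minimal sketch of the kind of code that mapped naturally onto that addressing mode (the classic K&R string-copy idiom; the function name is just illustrative): each *p++ can become a single load-or-store-and-advance instruction on a machine with auto-increment.

    /* Copy src into dst, advancing both pointers as we go.
     * On a PDP-11-style machine each *p++ corresponds to one
     * auto-increment addressing-mode instruction. */
    void copy_string(char *dst, const char *src) {
        while ((*dst++ = *src++) != '\0')
            ;  /* empty body; the test expression does the work */
    }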

kmeisthax 10 days ago

C solved the horrible machine code problem by inflicting the concept of undefined behavior on programmers, where blunt instruments called optimizers take a machete to your code. There's a very expensive document locked up somewhere in the ISO vault that tells you what you can and can't write in C, and if you break any of those rules the compiler is free to write whatever it wants.

This created a league of incredibly elitist[0] programmers who, having mastered what they thought were the rules of C, insisted to everyone else that the real problem was you not understanding C, not the fact that C had made itself a nightmare to program in. C is bad soil to plant a project in even if you know where the poison is and how to avoid it.

The inefficiency of Python[1] is downstream of a trauma response to C and all the many, many ways to shoot yourself in the foot with it. Garbage collection and bytecode are tithes paid to absolve oneself of the sins of C. It's not a matter of Python being "faster to write, harder to execute" as much as Python being used as a defense mechanism.

In contrast, the trade-off from AI is unclear, aside from the fact that you didn't spend time writing the code, and thus aren't learning anything from it. It's one thing to sacrifice performance for stability; it's another to sacrifice efficiency and understanding for faster code churn. I don't think the latter is a good tradeoff! That's how we got under-baked and developer-hostile ecosystems like C to begin with!

[0] The opposite of a "DEI hire" is an "APE hire", where APE stands for "Assimilation, Poverty & Exclusion"

[1] I'm using Python as a stand-in for any memory-safe programming language that makes use of a bytecode interpreter that manipulates runtime-managed memory objects.

immibis 10 days ago

In the original vision of C, UB was behaviour defined by the platform the code ran on, rather than the language itself. It was done this way so that the C language could be reasonably close to assembly on any platform, even if that platform's assembly was slightly different. A good example is shifts greater than the value's width: some processors give 0 (the mathematically correct result), some ignore the upper bits (the result that requires the fewest transistors) and some trap (the cautious result).
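A minimal C sketch of that shift case (function names are illustrative): the shift compiles to a single instruction, so the language leaves the out-of-range count undefined rather than pick one hardware behaviour over the others.

    #include <stdint.h>
    #include <stdio.h>

    /* Shifting a 32-bit value by 32 or more is undefined behavior in C,
     * precisely because different processors disagree on the result. */
    uint32_t shift_by(uint32_t x, unsigned n) {
        return x << n;                     /* UB when n >= 32 */
    }

    int main(void) {
        printf("%u\n", shift_by(1u, 35));  /* 0, 8, a trap, or anything else */
        return 0;
    }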

It was only much later that optimizing compilers began using it as an excuse to do things like time travel, and then everyone tried to show off how much of an intellectual they were by saying everyone else was stupid for not knowing this could happen all along.
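A small sketch of the "time travel" being described (hypothetical function; exact behaviour depends on compiler and flags): because the dereference is UB when p is null, an optimizer may assume p is never null and delete the later check, as if the check had been moved to before the crash.

    /* Dereference first, check second: with optimizations on, many
     * compilers conclude p != NULL from the dereference and remove
     * the branch below entirely. */
    int length_plus_one(int *p) {
        int n = *p;          /* UB if p == NULL */
        if (p == NULL)       /* likely optimized away */
            return -1;
        return n + 1;
    }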

achierius 10 days ago

You don't need a bytecode interpreter to get rid of UB in your language. E.g., instead of unchecked addition and array access, do checked addition and bounds-checked access. There are even efforts to make this the case with C: https://github.com/pizlonator/llvm-project-deluge/blob/delug... achieves a ~50% overhead, far better than Python.
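For what it's worth, here's a rough sketch of what "checked" can look like in plain C, using the GCC/Clang __builtin_add_overflow intrinsic (the helper names are made up):

    #include <stdbool.h>
    #include <stddef.h>

    /* Checked addition: report overflow instead of invoking signed-overflow UB. */
    bool checked_add(int a, int b, int *out) {
        return !__builtin_add_overflow(a, b, out);   /* false on overflow */
    }

    /* Bounds-checked read: validate the index before touching memory,
     * which is what a memory-safe runtime does for you implicitly. */
    bool checked_get(const int *arr, size_t len, size_t i, int *out) {
        if (i >= len)
            return false;
        *out = arr[i];
        return true;
    }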

And even among languages that do have a full virtual machine, Python is slow. Slower than JS, slower than Lisp, slower than Haskell by far.

pfdietz 9 days ago

Common Lisp and Scheme are typically compiled ahead of time right down to machine code. And isn't Haskell also?

There is a Common Lisp implementation that compiles to bytecode, CLISP. And there are Common Lisp implementations that compile (transpile?) to C.

pfdietz 10 days ago

Why was bytecode needed to absolve ourselves of the sins of C?

01HNNWZ0MV43FF 10 days ago

The AI companies probably use Python because all the computation happens on the GPU, and changing Python control-plane code is faster than changing C/C++ control-plane code.