TL;DW:
Call GET /cgi-bin/account_mgr.cgi?cmd=cgi_user_add&name=%27;<INJECTED_SHELL_COMMAND>;%27
account_mgr.cgi itself is safe: it takes the web parameters "name" and "pw" and calls the equivalent of
    execlp(..., "account", "-u", name, "-p", pw);

"account" was written by the intern and runs

    sprintf(buf, "adduser \"%s\" -p \"%s\" >/dev/null", opt_u, opt_p);
    system(buf);
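For concreteness, here's a minimal reconstruction of what that "account" helper would look like, going only by the description above - the option parsing, the buffer size, and the variable names are my guesses, not the shipped firmware:

    /* account.c - hypothetical reconstruction of the "account" helper,
     * based only on the snippet above, not the vendor's source. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        const char *opt_u = NULL, *opt_p = NULL;
        for (int i = 1; i + 1 < argc; i += 2) {
            if (strcmp(argv[i], "-u") == 0) opt_u = argv[i + 1];
            else if (strcmp(argv[i], "-p") == 0) opt_p = argv[i + 1];
        }
        if (!opt_u || !opt_p) return 1;

        char buf[512];  /* unchecked; a long enough name also overflows it */
        /* Whatever came in as -u or -p is pasted verbatim into a shell
         * command line; a double quote or $(...) in either breaks out. */
        sprintf(buf, "adduser \"%s\" -p \"%s\" >/dev/null", opt_u, opt_p);
        return system(buf);
    }

Everything the web server hands over as -u or -p lands, unescaped, inside a string given to system(), and that's the whole vulnerability.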
Never mind the actual mistake "the intern" made.
Not only was "the intern" tapped to write code that accepts user input from HTTP and also invokes system-administration shell commands - and does raw string handling in C, for that matter; who knows if `buf` is properly allocated? - but there was either no review/oversight, or nobody who looked saw the problem. On top of that, there are two layers of invoking a new program where surely one would suffice, and each layer does it in a different way. Even programmers who have never used Linux and know nothing about its shells or core utilities should be raising an eyebrow at that.
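On the "two layers" point: the CGI could have run adduser itself with a plain fork/exec and no shell anywhere, so quoting would never matter. A rough sketch - the adduser arguments come from the snippet above, not from the real firmware, and you'd still want to validate name (reject a leading '-', for instance):

    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* Run adduser directly. name and pw are passed as discrete argv
     * entries, so no shell ever gets a chance to interpret them. */
    int add_user(const char *name, const char *pw)
    {
        pid_t pid = fork();
        if (pid < 0) return -1;
        if (pid == 0) {
            execlp("adduser", "adduser", name, "-p", pw, (char *)NULL);
            _exit(127);  /* exec failed */
        }
        int status;
        if (waitpid(pid, &status, 0) < 0) return -1;
        return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
    }

    int main(void)
    {
        /* hypothetical values, already decoded from the HTTP request */
        return add_user("alice", "s3cret") == 0 ? 0 : 1;
    }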
Meanwhile, people want to use AI to generate boilerplate so that their own company's "the intern" can feel like a "10x developer" (or managers can delude themselves that they found one).
That’s insane.
It's also wrong. If the C code presented is accurate, the URL would have to contain &name=%22;shell-command-to-run;%22, or perhaps &name=$(shell-command-to-run). name=%27;shell-command-to-run;%27 is mostly harmless, because the single quotes end up inside the double quotes in buf and the shell treats them literally.
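To see it, here's a toy that just prints what buf would contain for each URL-decoded payload, using the same format string as the sprintf above (id stands in for the injected command; purely illustrative):

    #include <stdio.h>

    int main(void)
    {
        /* URL-decoded "name" payloads: %27 is ' and %22 is " */
        const char *payloads[] = {
            "';id;'",    /* single quotes stay inside the double quotes: inert */
            "\";id;\"",  /* double quotes close the quoting: ;id; runs */
            "$(id)",     /* command substitution runs even inside double quotes */
        };
        char buf[256];
        for (int i = 0; i < 3; i++) {
            snprintf(buf, sizeof buf,
                     "adduser \"%s\" -p \"x\" >/dev/null", payloads[i]);
            puts(buf);
        }
        return 0;
    }

Paste the three output lines into a shell and only the second and third actually run id.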
That's nit-picky, I know, but when some dude on the internet is trying to get clicks via manufactured rage at incompetent programmers, it's kinda ironic that his code is buggy too.