Next thing, LLMs that review code! Next next thing, poisoning LLMs that review code!
Galaxy brain: just put all the effort from developing those LLMs into writing better code
Man, I wish I could upvote you more. Most humans can't spot a wrong turn in real time; they only realize it once it's too late