outworlder 1 day ago

> If you don't have some tool installed, it'll install it.

Terrifying. LLMs are very 'accommodating' and all they need is someone asking them to do something. This is like SQL injection, but worse.

2
nxobject 8 hours ago

In an ideal world, this would require us as programmers to lean into our codebase reading and augmentation skills – currently underappreciated anyway as a skill to build. But when the incentives lean towards write-only code, I'm not optimistic.

rglover 21 hours ago

I often wonder what the first agent-driven catastrophe will be. Considering the gold rush (emphasis on rush) going on, it's only a matter of time before a difficult to fix disaster occurs.