I'm not sure why this is confusing. We're seeing the phenomenon everywhere in culture lately. People WANT something to be true and try to speak it into existence. They also tend to be the people LEAST qualified to speak about the thing they are referencing. It's not marketing hype; it's propaganda.
Meanwhile, the 'experts' are saying something entirely different and being told they're wrong or, worse, lying.
I'm sure you've seen it before, but this propaganda, in particular, is the holy grail of 'business people'. The ones who "have a great idea, just need you to do all the work" types. This has been going on since the late 70s, early 80s.
Not necessarily confusing, but very frustrating. This is probably the first time I've encountered such a wide range of opinions, and therefore such a wide range of uncertainty, in a topic this close to me.
When a bunch of people very loudly and confidently say your profession, something you're very good at, will become irrelevant in the next few years, it makes you pay attention. And when you can't see what they claim to be seeing, it makes you question whether something is wrong with you or with them.
Totally get that; I'm on the older side, so personally I've been down this road quite a few times. We're ALWAYS on the verge of our profession being rugged somehow: RAD tools, outsourcing, in-sourcing, no-code, AI/LLMs... I used to be curious about why there was overwhelming pressure to eliminate "us", but I gave up on that and now just focus on doing good work.
The pressure is simple: money. Competent people are rare, and we're not cheap. But it turns out those cheaper, less competent people can't replace us, no matter what tools you give them; there is a fundamental complexity to the work we do that they can't handle.
However, I think this time is qualitatively different. This time the rich people who want to get rid of us are not trying to replace us with other people. This time, they are trying to simulate _us_ using machines, to make "us" faster, cheaper, and scalable.
I don't think LLMs will lead to actual AI, and their benefit is debatable. But so much money is going into the research that somebody might just manage to build actual AI, and then what?
Hopefully, in 10 years we'll all be laughing at how a bunch of billionaires went bankrupt trying to convince the world that autocomplete was AI. But if not, a whole bunch of people will be competing for a much smaller pool of jobs, making us all much, much poorer, while they capture, right into their pockets, all the value that would normally have been produced by us.
I agree; I wasn't clear in my previous post. I understand the economic underpinnings. I cannot understand the coupled animus and have stopped trying.