leptons 3 days ago

>That isn't a useful lens

Yet that is the exact "lens" you chose to view fusion with. It applies equally to AGI.

> you didn't track the field until recently?

Okay, now you're trolling. You don't know me, so you're making assumptions based on what, exactly?

This conversation is over. I'm not going back and forth when you're going to make attacks like this.

Nevermark 3 days ago

My apologies if pressing my point came across too strongly. I will take that as a lesson for me.

I have not made any claims that fusion is unachievable. I just pointed out that the histories of the two technologies couldn't be more different.

There is no reason to believe fusion isn’t making progress, or won’t be successful, despite the challenges.

> you didn't track the field until recently?

That was a question, not an assumption.

If you are aware of the progression of actual neural network use (not just research) from the '80s on, or over any significant shorter time scale, great. Then you know its first successes were on toy problems, but it's been a steady drumbeat of larger, more difficult, more general problems getting solved with higher-quality solutions every year since, often quirky problems in random fields. Then came significant acceleration with Nvidia's introduction of general-purpose compute on graphics cards, and researcher-hosted leaderboards tracking progress more visibly. Only recently has the field been solving problems of a magnitude that is relevant to the general public.

Either you were not aware of that, which is no crime, or you were dismissing it, or perhaps simply not addressing it directly.

If you have a credible reason to believe that continued compute and architecture exploration won't sustain what has been 40 years of exponential progress, I want to hear it.

(Not being aggressive; I mean I really would be interested in that, even if it's novel and/or conjectural.

Chalk up any steam on my part to being highly motivated to consider alternative reasoning. This is not an attempt to change your mind, but to understand.)