Technically LessWrong is about rationalists, not effective altruists, but you're right in the sense that it's the same breed.
They think that the key to scientific thinking is to forgo moral limitations, not to study and learn. As soon as you're free from the shackles of tradition, you become 100% rational and therefore 100% correct.
Approximately no one in the community thinks this. If you can go two days in a rationalist space without hearing about "Chesterton's Fence", I'll be impressed. No one thinks they're 100% rational, nor that this is a reasonable aspiration. Traditions are generally regarded as sufficiently important that no small amount of effort has gone into trying to build new ones. Not only is it the case that no one thinks that anyone, including themselves, is 100% correct, but the community norm is to express credence in probabilities and convert those probabilities into bets when possible. People in the rationalist community constantly, loudly, and proudly disagree with each other, to the point that this can make it difficult to coordinate on anything. And everyone is obsessed with studying and learning, and constantly trying to come up with ways to do this more effectively.
Like, I'm sure there are people who approximately match the description you're giving here. But I've spent a lot of time around flesh-and-blood rationalists and EAs, and they violently diverge from that account.
So much vitriol. I understand it's cool to hate on EA after the SBF fiasco, but this is just smearing.
The key to scientific thinking is empiricism and rationalism. Some people in EA and LessWrong extend this to moral reasoning, but utilitarianism is not a pillar of these communities.
Empiricism and rationalism, both tempered by a heavy dose of skepticism.
On the other hand, maybe that is some kind of fallacy itself. I almost want to say that "scientific thinking" should be called something else. The main issue is the lack of experiment. Using the word "science" without experiment leads to all sorts of nonsense.
A word that means "scientific thinking, as much as possible, without experiment" would at least embed a dose of skepticism in the process.
The Achilles' heel of rationalism is the descent into modeling complete nonsense. I should give LessWrong another chance, I suppose, because that would sum up my experience so far, empirically.
EA to me seems like obvious self-serving nonsense: hiding something in plain sight to avoid detection.
That community is basically the "r/iamverysmart" types bringing their baggage into adulthood. Almost everything I've read in that sphere is Dunning–Kruger to the nth degree.