mckn1ght 1 day ago

I'm a fan of saying there's always a slippery slope; it's just a matter of the parameters.

But I think it's misguided to apply the human problem of othering to a given technology. Regardless of technology X, humans are gonna human. So, if X helps some people, we should consider it on that basis. Because without X, we will still have an endless stream of other reasons to draw red lines, as you allude to. Except we'll still have the problem that X could've helped solve.

If gene editing to cure diseases leads to a future where people want to shunt off the poor that are now the outsized burden of the healthcare system, the answer from where I sit is to find ways to make the gene therapies available to them, not to cart them off to concentration camps while they await extermination. This will require all the trappings of human coordination we've always had.

Preventing X from ever coming to fruition doesn't at all rule out the possible futures where concentration camps are a possibility. To me they are orthogonal concerns.

Even if you can convince one culture/society not to do it, how do you stop others? Force? Now you have a different manifestation of the same problem to solve. Society needs to learn how to "yes, and..." more when it comes to this stuff. Otherwise, it's just war all the way down.

sfink 11 hours ago

I mostly agree. Well:

> This will require all the trappings of human coordination we've always had.

It is also true that we've never had that coordination as quickly as it was needed, nor done as well as it needed to be. We will blunder into things that would be easy to predict in advance if we were willing to look and accept what we see, but we won't.

I absolutely agree that this advance is a great thing and should be pursued further. But I also think that simply categorizing it as good or bad is a way to willfully ignore the unintended consequences. We should at least try to do better.

> Society needs to learn how to "yes, and..." more when it comes to this stuff.

Absolutely. I just think that requires nuance, wide-open eyes, and acceptance of uncomfortable truths. Part of the nuance is not boiling it down to a yes/no question of "should this proceed?" (For example, how about: "How can we use these new capabilities to maximize benefit and minimize harm, when the only real lever we seem to have to work with is the profit motive? Everything else is getting undermined in its service.")