ninetyninenine 1 day ago

[flagged]

mckn1ght 1 day ago

I think part of the problem is that “really good” and “really bad” are not universally accepted norms for any given ethical question. What you’re seeing is your own value system assumptions being checked.

It’s perfectly reasonable to say that while a technology has the propensity to be used for evil, it also has positive applications and that the real benefit now outweighs the potential downside in a hypothetical future.

Otherwise you will go down a rabbit hole at the bottom of which lies a future where we all just kinda dig in the dirt with our hands until we die because every technological innovation can be used in a variety of ways.

Like, it’s silly to me that I can’t bring a 1.5” blade keychain utility knife on a flight, and then they hand me a metal butter knife in first class. I could do way more damage with that. But they allow the butter knife because its utility has been shown to far outweigh a potential downside that hasn’t manifested.

> I will slaughter a baby if I know for a fact that baby will grow up to be the next Hitler

This is one of those things that is easy to say precisely due to the impossibility of it ever actually becoming a real decision you have to make.

ninetyninenine 1 day ago

>This is one of those things that is easy to say precisely due to the impossibility of it ever actually becoming a real decision you have to make.

It's true. But things like this should be easy to say, right? We may not be able to act logically, but we should be able to think logically, communicate logically, and show that we are aware of what is logical.

My post got flagged, meaning a lot of people can't separate the two things. So, for example, I may not be able to kill the baby in reality, but I can at least see how irrational I am.

The person who flagged me likely not only can't kill the baby; he also has to construct an artificial reality to justify why he can't kill the baby and why his decision must be rational.

mckn1ght 1 day ago

I think things like that should be easy to say, if by that you mean free from censorship. Sure. But talk is cheap. And there are gradations of reality.

It would maybe be easier for a 15-25 y.o. to kill a baby they don't know and whose parents/family they don't know, and maybe even easier if they don't speak their language or look like them. Of course, the baby wouldn't be the only one you'd have to kill, most likely.

I submit that it would be very, very different if you found out that your 4-year-old child was going to go on to be the next Hitler. I think a "normal" person would go to the ends of the earth to try to shape them into the kind of person who wouldn't do it. Very few people would coldly calculate "welp, guess I gotta execute this adorable little child I've grown so attached to" as the child looks up at them saying "I love you so much forever, mommy/daddy" with their little doe eyes.

(ETA: it also brings up side questions about nature vs nurture and free will)

And then consider the lifelong repercussions of the emotional fallout. You can use all the logic in the world to justify the action, but your human brain will still torment you over it. And likely, most of the other human brains that learn about it would torment you as well.

---

So, while I think you can say things like that, ie the ability and allowance, I think you should question whether you should. Saying those kinds of things doesn't add much to the discussion, because it's really just an uninformed platitude that only someone lacking life experience would believe.

For me this all highlights the fact that meaty ethical questions don't have a simple reductive answer. Which ties back into the original problem: OP outright states that this is simply and clearly the wrong path to go down.

(PS the downvoting/flagging could be due to breaking the guidelines around discussing downvotes and flags, and not actually due to the topical content of the posts, and/or assuming bad faith on the part of other users as such: https://news.ycombinator.com/newsguidelines.html)

ninetyninenine 15 hours ago

>So, while I think you can say things like that, ie the ability and allowance, I think you should question whether you should.

You should, because many choices in life are not strictly black and white. Saving a baby's life versus introducing gene editing to humanity. If there were a baby we knew would grow up to slaughter millions, it's absolutely worth talking about. In the age of AI and gene editing, where things are influencing what it even means to be human, it is wise to stop and pause for a minute to ask the right question rather than charge forward with change that can't be taken back, all because we wanted to save a baby.

cooper_ganglia 1 day ago

The anti-eugenics guy just said he would "absolutely" murder a baby...?

ninetyninenine 1 day ago

I would if I could foresee the future. But with eugenics you can't foresee the future. Artificially self-selecting for genetic traits doesn't guarantee a good future. There's no gene for recreating Hitler, either.

psychoslave 1 day ago

[flagged]

ninetyninenine 1 day ago

A baby Hitler guarantees a future with a grown-up Hitler. Killing the baby eliminates that future.

There could be other babies who could also grow up to be future Hitlers. So let's say 4 such babies exist. By killing one, I've eliminated 1/4 of the futures with grown-up Hitlers.

This whole thread is getting flagged. Likely by an irrational parent who can't even compute natural selection, a baby, and Hitler all in a single paragraph.

NoMoreNicksLeft 1 day ago

[flagged]