aleph_minus_one 5 hours ago

> Ideally they do lose all of that. That’s the root of the problem.

What is the problem with that?

The problem, rather, is that the user did not create a private backup of the data they want to keep.

kranner 5 hours ago

Possible contact with pedophiles, groomers, etc.

Once the child is over 16, they can add all their real-world friends again.

LinuxBender 5 hours ago

Could a possible solution be to use the same language-detection platforms used for detecting terrorist activity to also flag possible grooming for human moderator review? Or might that be too subjective for current language models, leading to many false positives?
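
A minimal sketch of the routing idea, with a trivial keyword heuristic standing in for a real trained classifier. Everything here is illustrative, not any platform's actual system: the threshold, the phrases, and the names are all made up for the example.

    # Sketch: route messages whose "risk" score exceeds a threshold to a
    # human moderator queue. The scoring function is a stand-in; a real
    # deployment would use a trained text classifier, not keywords.
    from dataclasses import dataclass

    @dataclass
    class Message:
        user_id: int
        text: str

    # Illustrative cutoff; tuning it trades missed cases for false positives.
    REVIEW_THRESHOLD = 0.8

    def risk_score(text: str) -> float:
        # Placeholder for a real model: a crude phrase match, purely
        # for illustration of where the classifier would plug in.
        suspicious = ("keep this secret", "don't tell your parents",
                      "how old are you")
        return max((0.9 for p in suspicious if p in text.lower()),
                   default=0.1)

    def route(msg: Message, review_queue: list) -> None:
        if risk_score(msg.text) >= REVIEW_THRESHOLD:
            # Flagged for human review only; no automatic action is taken.
            review_queue.append(msg)

    queue: list = []
    route(Message(1, "How old are you? Don't tell your parents."), queue)
    print(len(queue))  # 1: sent to the moderator queue

The human-in-the-loop step is the point: the model only triages, so its subjectivity costs moderator time rather than producing automatic bans.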

Hizonner 4 hours ago

AKA stupid paranoia.

acdha 4 hours ago

This is far too pat a dismissal of something that happens regularly. You can argue that it is not frequent enough to justify this action, or that it would happen anyway through other means, but it is a real problem, and it is not so freakishly rare that we can dismiss it out of hand.

johnisgood 4 hours ago

Discord requires users to be at least 13 in many countries, yet there are plenty of under-13s there. The age restriction is not working.

acdha 2 hours ago

I’m not saying anything about specific services, only that there is a legitimate concern which can’t simply be dismissed without reason.

johnisgood 28 minutes ago

I am not sure I meant to reply to you, to be honest. It is an issue, but so far the solutions are terrible, and outsourcing parenting to the government or to companies is also meh. I am sure there are parents who know ways to reduce screen time for their children: it ranges from installing a program that blocks certain websites or applications until some condition is met, to taking the phone out of the kid's hand and going for a walk or studying together, whatever.