Ideally they do lose all of that. That’s the root of the problem.
It may include all my friends from primary school and photos of my late grandma.
(Disclaimer: I'm so old that at 16 I didn't even have email. Please don't delete all my old stuff.)
> Ideally they do lose all of that. That’s the root of the problem.
Where is the problem with this?
The problem, rather, is that the user did not create a private backup of the data they want to keep.
Possible contact with pedophiles, groomers, etc.
Once the child is over 16, they can add all their real-world friends again.
Could a possible solution be to use the same language-detection platforms used for detecting terrorist activity to also flag possible grooming for human-moderator review? Or might that be too subjective for current language models, leading to many false positives?
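The tradeoff mentioned above can be made concrete with a toy sketch of a flag-for-review pipeline. The keyword scorer and threshold below are illustrative assumptions standing in for a real trained classifier, not any platform's actual system; the point is only that the review threshold directly trades false positives against missed cases.

```python
# Toy flag-for-review sketch. The phrase weights and threshold are made-up
# stand-ins for a trained model's scores; do not read them as a real system.
SUSPICIOUS_PHRASES = {
    "don't tell your parents": 0.9,
    "our secret": 0.7,
    "how old are you": 0.4,
}

REVIEW_THRESHOLD = 0.8  # raising this cuts false positives but misses more cases


def score(message: str) -> float:
    """Crude additive score; a real system would use a trained classifier."""
    text = message.lower()
    return sum(w for phrase, w in SUSPICIOUS_PHRASES.items() if phrase in text)


def flag_for_review(messages):
    """Return only the messages a human moderator should look at."""
    return [m for m in messages if score(m) >= REVIEW_THRESHOLD]


msgs = [
    "how old are you? this can be our secret",  # 0.4 + 0.7 = 1.1 -> flagged
    "how old are you turning this birthday?",   # 0.4 -> below threshold
]
print(flag_for_review(msgs))
```

Even in this caricature, the second message shows why false positives worry people: innocuous questions share surface features with grooming, which is exactly why the output goes to a human reviewer rather than triggering automatic action.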
AKA stupid paranoia.
This is far too pat a dismissal of something that happens regularly. You can argue that it's not frequent enough to justify this action, or that it would happen anyway through other means, but it's a real problem and not so freakishly rare that we can dismiss it.
Discord is restricted to people over 13 in many countries, yet plenty of younger kids are on it anyway. The restriction is not working.
I’m not saying anything about specific services, only that there is a legitimate concern which can’t simply be dismissed without reason.
I am not sure I meant to reply to you, to be honest. It is an issue, but so far the proposed solutions are terrible, and outsourcing parenting to the government or to companies is also meh. I am sure there are parents who know ways to reduce their children's screen time: anything from installing a program that blocks a website or app until certain conditions are met, to simply taking the phone out of the kid's hand and going for a walk or studying, whatever.