@edwiebe@mstdn.ca @dalias@hachyderm.io @Em0nM4stodon@infosec.exchange From what I understand, active age verification necessarily invades privacy.
But active verification is not necessary.
A mere social media ban under age X, if deemed necessary, could simply be passed as a law that makes parents responsible for ensuring their children follow it. Laws of this kind already exist for other areas of life, and since parents are responsible for supervising their children anyway, they can certainly be made responsible here too.
The opposite holds as well: while the child is supervised by their parents, such restrictions should not apply.
To support the ban, I still think it'd be useful to have a software solution, optional at the parents' discretion. Sure, one could go all-allowlist using e.g. Google Family Link, but I'd prefer sites to declare their purpose, along with other properties such as the severity of various kinds of NSFW content, potentially even at multiple levels, of which the client picks one and specifies it in a request header, for such software to use. That's trivial to implement: it's just one file placed in the web server's root and it'll work. It could be stored in DNS instead; the exact transport doesn't matter.
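To make the idea concrete, here's a minimal sketch of what such a self-declaration and the client-side check could look like. The file path, field names, and severity scale are all hypothetical, invented for illustration; nothing here is an existing standard:

```python
import json

# Hypothetical declaration a site might serve from its web root,
# e.g. at /.well-known/site-declaration.json (made-up path).
SAMPLE_DECLARATION = json.dumps({
    "purpose": "social-media",
    # Made-up severity scale: 0 (none) .. 3 (explicit), per category.
    "nsfw": {"nudity": 0, "violence": 1, "gambling": 0},
    "min_age": 13,
})


def site_allowed(declaration_json: str, child_age: int, max_severity: int) -> bool:
    """Decide whether parental filtering software should permit a site,
    based purely on the site's self-declared properties."""
    decl = json.loads(declaration_json)
    if child_age < decl.get("min_age", 0):
        return False
    # Every declared NSFW category must stay within the parent-chosen level.
    return all(level <= max_severity for level in decl.get("nsfw", {}).values())


print(site_allowed(SAMPLE_DECLARATION, child_age=14, max_severity=1))  # True
print(site_allowed(SAMPLE_DECLARATION, child_age=12, max_severity=1))  # False
```

The same chosen severity level could then be advertised by the client in a request header, so sites can adapt what they serve.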
Furthermore, while we're at it, we could combine this with a technical solution for COPPA and other regulations that ban tracking and surveilling children online: revive Do-Not-Track, and have the aforementioned software set the header automatically for minors.
But, I hear Big Tech say, then what if adults set the header too?
Then you don't effing track them either.
But... what if everyone sets it?
Then the people have spoken.