
Domi


Better (decentralized) Regulation and Protocol for Parental Control?

There seem to be more and more discussions about, and revelations of, how some corners of social media are downright dangerous for kids. This prompts a few reactions:

  1. That's awful, and:
  2. Oh no... politicians are going to pass nonsensical regulation on a topic they don't understand.
  3. But... I like to browse anonymously... I don't want services to be required to verify someone's identity before they can be used. But I can definitely see it happening, especially in Europe.
  4. And most importantly: I don't want to re-enact 1984, like they do in China, where facial recognition and ID cards must be provided to use all kinds of online services (such as games), so companies (and the government) can prevent kids (or everyone) from accessing the internet as a whole, or just certain content. Also, I don't trust anyone to keep that type of data safe, ever.

So I was wondering... if this problem needs to be tackled, can we find a better solution before panicked politicians take a sledgehammer to it?

Instead of trusting services to perform identity verification, can we somehow leave it to the parent?

To that end, considering the immense amount of resources that identity verification would cost to begin with, couldn't that money be used to develop a better parental-control mechanism and experience instead? Maybe one that is tied to the computer/phone/device (or even a network of devices, if they have more than one), rather than being enforced by the provider/company/government?

I'm thinking: why not implement an open-source protocol that works as follows:

  1. The privileged parent can turn certain types of content on and off, e.g. by domain/URL, by "content flags" (nsfw, partial nudity, and PG-13 could all be content flags), or even by keywords (see the sketch after this list).
  2. Applications and content providers can opt into that protocol, to check whether something may be accessed or not.
  3. There would be sensible, maybe crowdsourced, default settings, so parents don't have to make the hard decisions on their own (similar to how ad-block filter lists work, afaik).
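
To make the parent-facing side of this a bit more concrete, here is a minimal sketch of what the policy model could look like. Everything in it (`ContentFlag`, `Policy`, `withParentOverrides`, the flag names) is made up for illustration; it's not an existing API or standard.

```typescript
// Hypothetical policy model for the "privileged parent" actor.
// All types and values here are illustrative, not an existing standard.

type ContentFlag = "nsfw" | "partial-nudity" | "pg-13" | "unmoderated";

interface Policy {
  blockedFlags: Set<ContentFlag>; // content categories the parent has turned off
  blockedDomains: Set<string>;    // explicit domain/URL blocks
  blockedKeywords: string[];      // keyword-based blocks
}

// Sensible, possibly crowdsourced defaults, so parents don't have to
// make every decision themselves (comparable to ad-block filter lists).
const defaultPolicy: Policy = {
  blockedFlags: new Set<ContentFlag>(["nsfw", "partial-nudity"]),
  blockedDomains: new Set<string>(),
  blockedKeywords: [],
};

// The parent can then tighten or loosen individual rules on top of the defaults.
function withParentOverrides(base: Policy, overrides: Partial<Policy>): Policy {
  return { ...base, ...overrides };
}

const familyPolicy = withParentOverrides(defaultPolicy, {
  blockedKeywords: ["gambling"],
});
```

The point is only that the settings live with the parent and the device, not with the service or the government.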

E.g. the browser implements the protocol as a "hub" actor, and websites that want to offer any non-safe or unmoderated content must (over an implementation time frame of, say, a few years) implement the "content provider" actor of the protocol. The content provider tells the hub what type of content it is serving (this could start out as simple as a new HTTP header, or an initial handshake whose result stays cached for a while), so the hub can check whether it should be served.
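
As a rough sketch of that check (and only a sketch: the "Content-Flags" header and the `isAllowed` function are invented for illustration, and it reuses the `Policy` type and `familyPolicy` value from the sketch above), the hub side could look something like this:

```typescript
// Hypothetical hub-side check: the browser asks the content provider what it
// is about to serve and compares that against the parent's policy.
// "Content-Flags" is an invented header, not an existing standard.

async function isAllowed(url: string, policy: Policy): Promise<boolean> {
  const host = new URL(url).hostname;
  if (policy.blockedDomains.has(host)) return false;

  // Initial handshake: a HEAD request whose result could be cached for a while.
  const response = await fetch(url, { method: "HEAD" });
  const declared = (response.headers.get("Content-Flags") ?? "")
    .split(",")
    .map((flag) => flag.trim())
    .filter((flag) => flag.length > 0);

  // Refuse to serve the page if it declares any flag the parent has turned off.
  return !declared.some((flag) => policy.blockedFlags.has(flag as ContentFlag));
}

// The hub (browser) would consult this before rendering:
// isAllowed("https://example.com/watch", familyPolicy).then(console.log);
```

A provider that never serves flagged content wouldn't need to do anything; one that does would just add the header (or answer the handshake) and let the device decide.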

It's just a rough idea I've been kicking around, but it certainly sounds more appealing than waiting for the next (much worse) iteration of the "Do you want to allow our cookies?" bull.

Top comments (1)

Danny Acton

Yeespy takes its job as a parental control very seriously, and this is evident in its efforts. The Yeespy parental control software suite offers one of the most comprehensive feature sets, and it's easy to learn how to monitor your child's text messages on an iPhone with Yeespy. Thanks!