
Joe Honton

Taking Responsibility for Ethical Design

We have an ethical dilemma: how can we protect an individual’s personal data from misuse, while still providing a useful experience?

Several aspects of this need our thoughtful consideration: data as a commodity, data amalgamation without express consent, data archival for indefinite periods of time, the expungement of data to protect our youth, data custodianship versus ownership, and others like these.

In short, for each new tech project we should be asking: Why are we collecting this data? How long do we really need to keep it? And what harm could result from its misuse, by us or by others?

In days gone by, public domain records were limited to birth, wedding, and death certificates. For anything else, the picture was filled out by family albums and newspaper clippings. Data archiving was limited to what the archivist could squeeze into his cinder-block storage facility. Data retention was fallible: fires, floods, and mold took turns at reducing history to a sketch.

In contrast, today it seems that everything we’ve ever done, said, posted, or photographed, is being captured and archived, unable to fade from memory.

But some things should be forgotten, otherwise our children won’t be able to grow up and experiment and make mistakes. Teenagers need to be protected from themselves. We need to have regulations in place to allow them to expunge the data of their misdeeds, their bad choice of words, their poor judgment with cameras. How else can we protect them from being forever dragged down after they’ve matured?

And we need to have better rules about the ownership of data we generate as a by-product of our tech activities. Who should have ultimate control over the ownership and retention rights of such data, the individual, or the business that stores it? Who really owns it? If internet companies are custodians of that data, what responsibilities do they have to the people whose lives are being recorded?

We need to develop a social contract in which tech companies and the users they serve agree that personal data is not a commodity. At a minimum, it should begin with:

  • A non-amalgamation clause in our end-user licensing agreements that restricts or prevents us from joining data that we collect with data that we obtain from third parties without express consent from the user.

  • A data retention clause that prohibits the archival of transaction data for unlimited periods of time without the express consent of the user, and that mandates that we erase those transactions when that period expires.

  • A data collection clause that restricts us from capturing or storing ephemeral data, such as GPS waypoints.

  • A data purging clause that specifies how long usage data — such as movies seen, books read, or music listened to — can be kept before mandatory deletion must occur.

When we start putting these kinds of guidelines in place, we’ll begin to regain the trust that’s been lost.
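To make the retention and purging clauses concrete, here is a minimal sketch of what enforcement could look like in code. This is purely illustrative: the 90-day window, the record shape, and the `purge_expired` function are all assumptions, not part of any real system described in the article.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window the user consented to (assumption: 90 days)
RETENTION = timedelta(days=90)

def purge_expired(records, now=None):
    """Keep only records still inside the consented retention window.

    Each record is a dict with a timezone-aware 'stored_at' timestamp.
    Anything older than RETENTION is dropped, mirroring the mandatory
    deletion called for by the data purging clause above.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["stored_at"] <= RETENTION]
```

The point of a sketch like this is that expiry becomes a routine, automated part of the data pipeline rather than a manual cleanup that never happens, which is exactly the shift from indefinite archival to consent-bounded retention that these clauses ask for.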


Excerpted from Ethical Design in the Orwellian Commons.
