Teens and Online “Eraser” Laws: Good Intentions, but the Wrong Approach?
Utah may soon become the second state to give teenagers the right to a fresh start online. State Senator Todd Weiler has stated that he will introduce a bill to allow teens to erase their social-media profiles and data—giving the under-18 crowd a statutory right to request that Facebook, Instagram, Twitter, and other social-media sites delete their data permanently.
Politicians point to the sealing of juvenile criminal records as another example of erasure. If we can seal those records, then why not delete other adverse or embarrassing information from a person’s youth? The goal is to ensure that the person is not judged based on those past indiscretions when applying for colleges or for a job.
Why is this new legal move occurring? The information that teens post can be embarrassing, and it often leads to regret once teens become a bit older and wiser and realize that the information they posted is coming back to haunt them. That information—which may initially have been posted purely for social purposes—can later be used in different contexts, such as decisions relating to employment, credit, renting an apartment, or applying to college. It’s unlikely, for example, that a 17-year-old at a keg party knows anything about credit reporting and how an ill-advised photo might lead to the denial of a lease or a job.
In this column, I will examine the first law to provide a right to erase data. I will also examine the underlying theory behind the establishment of such a right, and connect it to the way in which we historically have allowed minors to void their contracts. The right to erase posts in the online context thus mirrors a longstanding social policy holding that we must treat children and teens differently than we do adults. While completely deleting a teen’s data may be impossible, perhaps policymakers should focus, instead, on how such social-media data can be used in the future. Rather than erasing the data, it may be prudent to provide that data captured before a person turns 18 cannot be used in certain decisions that may later be made about that person—namely, decisions pertaining to credit, employment, and other determinations covered by the Fair Credit Reporting Act (FCRA).
The California Example
The purpose of a social-media law like the one described above is to allow anyone under 18 to request that his or her social-media profiles be scrubbed. In that event, social-media sites would have to delete the information from their servers.
In the fall of 2013, Governor Jerry Brown signed into law the “eraser” bill, entitled Privacy Rights for California Minors in the Digital Age, which sought to make it easier for younger social-media users to delete their embarrassing social-media histories. The California law applies only to minors—that is, those under 18. It was passed to help protect the employment prospects of students and recent high school or college graduates.
Even as California’s law was enacted, however, teens appeared to be shunning Facebook and shifting onto private, closed networks. A 2013 Pew Internet study found that most teens had “waning enthusiasm for Facebook,” while the newer personal, ephemeral photo-sharing service Snapchat was serving 350 million photos per day.
And Facebook is now actually taking an approach opposite to that of smaller, closed, or more private networks. Facebook recently announced that the 13-to-17-year-old crowd could make completely public posts—likely, much to the chagrin of their parents. Previously, the over-13 crowd could share status updates, photos, and other content only with their friends or friends of friends. The Facebook change occurred weeks after the California law passed, perhaps giving legislators an “I told you so” feeling about their work.
Minors, Ordinary Contracts, and the Right of Disaffirmance
In virtually all states, unmarried persons under the age of 18 are permitted to enter into any contract that an adult can enter into, provided that the contract is not one that is prohibited by law for minors (e.g., a contract to buy cigarettes or alcohol). However, contracts entered into by minors are generally voidable by the minor. What exactly does that mean? A minor may disaffirm a contract, meaning that he or she renounces his or her contractual obligations.
After a contract is disaffirmed, it is treated as if it never existed, and any consideration (defined in the law as a thing of value) that changed hands is to be returned to the minor; the minor, in turn, must return whatever consideration he or she received. The contract is, in essence, voided and deleted. The reasoning behind allowing minors to choose when to disaffirm their contracts is the assumption that minors will only choose to disaffirm contracts that are detrimental to their interests.
The privilege to disaffirm, however, is not unqualified. For example, states generally agree that minors may not disaffirm contracts for the necessities of life (e.g., clothing, food, or medical care).
Will the Erasure Policy Work?
Many privacy advocates and technology experts point out that it is challenging to truly delete any data or images once they are posted online or shared. Just because someone scrubs his or her own profile, he or she is not out of the woods yet. If minors have shared anything with others, that material remains available on the Internet and may be stored by other friends or foes. Retweets, reposts, photo sharing, and the like may all lead to multiple copies of the same information being passed on. Thus, the law may not be able to achieve its main purpose: to obliterate past data and to give teens a fresh start—in the same way bankruptcy courts can clear the slate of a debtor.
But the real challenge is not only that information and images linger. Instead, it is the fear of how such information is used that worries legislators and grownups. Teens do not necessarily know about the future consequences they may face. If they have not yet applied for a job, or applied for insurance or credit, then it is hard for them to understand that embarrassing posts or images might end up costing them these kinds of opportunities or benefits in the future. This is about context. Teens (and even adults) are not always conscious of how data may be repurposed and used in completely different contexts. This is the underlying impetus for the California law.
Meanwhile, to make things even more complicated, new types of companies have cropped up to compile and catalogue our social-media activities—to be used in new contexts such as insurance applications and employment screening. The company Social Intel, for instance, collects publicly available social-media information and keeps it on file. And data can legally be kept for up to seven years under the Fair Credit Reporting Act. So if you misbehave at 15, evidence of this could be retained by a data broker and used until you are 22.
Social Intel notes on its website that “[job] [a]pplicants social media provides valuable information that can improve hiring results; however employers are caught in a catch-22.” Social Intel also notes that “in an era where national security is of the utmost importance, leveraging new tools and data sources including social media is key to mitigating known and unknown threats.” So, by that logic, a teen’s data becomes relevant not only to employers, but to the government as well.
As I noted above, the use of social-media data is the underlying issue that concerns legislators. As a result, it may be prudent for legislators to examine whether to amend existing laws such as the FCRA—or to prohibit in other ways the use of new types of data in important adult decisions such as credit and employment screening. And of course, there is likely to be a larger debate around the parameters of such prohibitions—should all teen data be out of bounds? Should evidence of illegal activity, for example, be barred from further use?
The FCRA, for example, does not require companies and data brokers to erase consumer profiles: they still exist, and may be used for other purposes (such as targeted marketing). But the FCRA does regulate how certain data can be used in the credit context—and we have prohibited the use of certain types of adverse information in decision-making. States and cities, for example, prohibit the use of certain criminal convictions in employment decisions. We also do not allow employers to ask employees for certain types of information, such as their marital status or their religious beliefs. Of course, it may be possible to find out about such things despite the prohibition on asking about them—but the law tells employers what they are prohibited from doing, and provides remedies when violations nonetheless occur.
Thus, it may be that the new generation of laws will not provide a right to erase data, but will instead focus on how a teen’s data is used. And many consumers may find themselves saying that, if teens have this right to be forgotten, then maybe we adults should, too. In the EU, this is precisely the issue being tested and debated at the moment, as I have described in a prior column.