Last week, U.K. Prime Minister David Cameron dropped a bombshell when he announced that all British households would have their access to online pornography blocked unless they choose to “opt in” and allow Internet porn access. By the end of 2014, households will have to accept or reject an automatic porn filter that the government will require Internet Service Providers (ISPs) to deploy. Cameron is betting that most households will stay with the default option of blocking porn access, and keep the filter running—due to either inertia or embarrassment.
Cameron’s move is meant to protect “children and their innocence,” and also to prevent the spread of exploitative sexual images over the Internet. No doubt, the goal is laudable—but as many people start to figure out how Cameron’s plan will be implemented, it is becoming quite clear that Cameron’s aides have not fully figured out their implementation strategy.
In his new plans, Cameron unveiled measures meant to stop public access to Internet porn. The Prime Minister has repeatedly told the British public that access to online pornography risks “corroding childhood,” evoking images of Victorian paternalism and sensibility. As more details of Cameron’s plans are unveiled, it is unclear whether Cameron’s filtering or government censorship will stop with pornography, or will extend to other types of offensive content.
In this article, I will discuss the basic contours of the Cameron plan, and the potential problems raised by the current proposal. There is no real controversy about the goal of preventing minors from accessing porn, but there is significant controversy about the potential overbreadth of the filters that will be used in attempting to do so.
It is also unclear why a democratic government entity needs to decide what filters to deploy—rather than leaving it to parents and guardians to decide how to restrict children’s access to content. While some governments already filter what kinds of content citizens can access, Cameron’s is the first blanket move of this type that has been attempted in a constitutional democracy that prides itself on freedom of expression. Australia does monitor overseas websites for illegal content, but not in the sweeping manner suggested by Cameron.
The Cameron Plan
Cameron’s “war on porn” consists of an agreement with the major U.K. ISPs to block access to adult material as a default. That means, in theory, that all the devices linked to one household subscription will be prevented from reaching explicit sites unless the subscriber affirmatively opts in and says, “yes, I want to see pornography.”
The filters will be implemented by the U.K.’s major ISPs, which account for 90% or more of British Internet users. People who set up new Internet accounts or switch subscriptions will be told to opt in or out by the end of this year. These choices will, of course, lead to lots of difficult conversations. Will college students tell their parents, or spouses tell their loved ones, that they want to keep access to porn and need to “opt in”? Or will Cameron’s theory work? Will people stick with defaults out of inertia and therefore live in a porn-free world?
Cameron has made clear that all the ISPs will configure their own services so that filters will be deployed on any computer or other device linked to a subscriber’s account.
Where Will the Line Be Drawn? The Plan That Has Not Yet Been Thought Out
Cameron is facing much dissent. There are lots of questions about how his plan for default online porn filters would work in practice. His own lack of certainty was revealed after he suggested that topless female images, such as those used in the Sun newspaper’s infamous Page Three, would still be visible online. Cameron admitted that there would be “problems down the line” with the system—and appeared to rule out “soft” and written pornography from the scheme entirely. He noted, for example, that the erotic novel Fifty Shades of Grey would likely still be accessible online. Between topless Page Three images and explicit literature, it is unclear how Cameron would draw a line as to what can be accessed and what would be blocked.
As a result, it is unclear how Cameron or anyone else implementing the new requirement will be able to filter content effectively—as someone must decide what constitutes hardcore versus soft-core pornography. No wonder Cameron’s proposals have been criticized by anti-censorship groups, who warn that sites about public health and sexuality could inadvertently get blocked. Will teens and adults be able to access sites that are focused on sex education, for example? Will websites discussing contraception be blocked as well if they contain diagrams that depict intercourse?
In interviews after a recent speech, Cameron seemed uncertain exactly which sites should be blocked by filters—and seemed to concede that filtering technology still had weaknesses. Speaking on a BBC news program, Cameron said what would be included in the filters would evolve. “The companies themselves are going to design what is automatically blocked, but the assumption is they will start with blocking pornographic sites and also perhaps self-harming sites,” he said.
“It will depend on how the companies choose how to do it,” Cameron has also noted. “It doesn’t mean, for instance, it will block access to a newspaper like The Sun, it wouldn’t block that—but it would block pornography.”
Cameron has also admitted, “I’m not saying we’ve thought of everything and there will be many problems down the line as we deal with this, but we’re trying to crunch through these problems and work out what you can do and can’t do.”
Unfortunately for Cameron, filtering pornography is difficult to do with precision. Although the technology has improved greatly, filters that were set up in hospitals several years ago had to be turned off after physicians were unable to access clinical studies on breast cancer. The major challenge for such filters is the sheer variety of legitimate content that can get swept into the larger net—think, for instance, of content focused on public health and medicine being blocked because of sexual language or diagrams. And, if the ISP at issue is deploying a filter, a customer may never know that certain sites were blocked and thus secretly made inaccessible to them.
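The over-blocking problem described above can be illustrated with a minimal sketch of naive keyword-based filtering. The keyword list and page texts below are hypothetical examples chosen for illustration—no ISP’s actual blocklist or filtering logic is known from the source:

```python
# Hypothetical sketch of naive keyword-based filtering, illustrating
# why legitimate sites (e.g., clinical research) are easily over-blocked.
# The keyword list below is an invented example, not a real blocklist.

BLOCKED_KEYWORDS = {"porn", "xxx", "breast", "intercourse"}

def is_blocked(page_text: str) -> bool:
    """Block a page if any word matches the keyword blocklist."""
    words = page_text.lower().split()
    return any(word.strip(".,;:!?") in BLOCKED_KEYWORDS for word in words)

# A clinical study gets blocked alongside explicit content,
# mirroring the hospital-filter anecdote above:
print(is_blocked("New clinical trial results for breast cancer screening"))  # True
print(is_blocked("Recipes for a summer picnic"))                             # False
```

More sophisticated filters use image analysis and site categorization rather than bare keywords, but the underlying trade-off between over-blocking and under-blocking remains.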
Based on conversations with several ISPs, the British nonprofit Open Rights Group announced recently that the new Cameron controls would likely extend beyond pornography. By default, the group said, the filters will block access to “violent material,” “extremist and terrorist related content,” “anorexia and eating disorder websites,” and “suicide related websites.” In addition, the new settings may censor websites that mention alcohol or smoking. Thus, Cameron’s advisors need to be much clearer as to the contours of the policy that they will ask ISPs to implement: What will the default actually look like, and how broad will the scope of the filters be? If asked, we each might have our own sense as to what the filtering will entail, and each of us may be wrong in some respects.
Cameron himself seems unclear as to what is and what is not porn under his own proposal—again making his proposal seem half-baked. Some women’s groups may reasonably believe that Page Three of The Sun is pornography—but that’s not a view that Cameron apparently shares. Content filtering may not be the best way to decide what counts as pornography.
British Teens May Well Figure Out How to Defeat Cameron’s Controls
Yet another problem with Cameron’s plan, more generally, is that teens may figure out how to circumvent these controls. Cameron has told ISPs, in a letter, that he “expects customers to be required to prove their age/identity before any changes to the filters are made.” He also adds, “I understand that you will all be implementing ‘closed-loop’ systems which will notify account holders of any changes that are made to the filters.”
Children are clever, however, and they may easily be able to intercept mom and dad’s emails in order to delete or accept any change to controls—and to provide ID verification by typing in Mum or Dad’s driver’s license number or credit card number.
The US Experience: Limiting Filtering to Libraries and Schools
The U.S., with its robust free speech protections, has long toyed with ways to keep offensive content off of the Internet. Since the First Amendment allows the publication of sexually explicit material, Congress eventually focused on regulating the effects of speech rather than the speech itself. In 2000, Congress enacted the Children’s Internet Protection Act (CIPA). This statute required libraries and public schools to deploy Internet filtering software in order to receive federal financial support. Under CIPA, a school or library seeking to receive federal funds for Internet access has to certify to the U.S. Federal Communications Commission that it has installed technology that filters or blocks material deemed to be obscene, child pornography, or “harmful to minors.”
The U.S. Supreme Court ultimately rejected First Amendment challenges to CIPA, holding that speakers had no blanket right of access to public libraries, and that library patrons could also request unblocking of sites if they so chose. The majority noted, for example, that librarians choose books for purchase and curate collections.
Some libraries and schools have rejected federal funding, but most have felt compelled to install filters out of fiscal necessity. The filtering software itself is supplied by private companies, which compete for market share.
CIPA does permit the disabling of content filters for adults and, in some instances, for minors who need access “for bona fide research or other lawful purposes.” It relies on teachers and librarians to make the ultimate call as to who qualifies for an exception. Ultimately, Congress has left the areas of home and private Internet access alone, leaving families to decide for themselves what filters to deploy, if any.
This is not to say, however, that the U.S. is not trying to enlist companies to remove harmful content from the Internet. Policymakers have a rational desire to protect children from harm on the Internet. This has sparked U.S. government initiatives to impose content-based restrictions on ISPs. Law enforcement often pushes ISPs in the U.S. and elsewhere to launch self-regulatory initiatives aimed at blocking illegal and exploitative content. Concerns over child safety online have focused attention on the potential risks associated with time spent on social networking sites such as Facebook or Backpages.com, where minors may come into contact with sexual predators. But the U.S. has tried to draw the line when it comes to blocking illegal content versus keeping all sexual content off the Web.
New Revelations of a Possible Link to Chinese Technology
In the past few days, as reporters have focused on the Cameron anti-porn crusade, new revelations in the media related to this topic have taken a surprising turn. It turns out that a filter that the U.K. is considering using has links to the Chinese company Huawei. Cameron praised British ISP TalkTalk on Monday for showing “great leadership” in protecting British kids with its “HomeSafe” service. This service allows subscribers to filter and exclude X-rated content. However, Cameron failed to note that Huawei, which is often accused of spying for the Chinese government, partnered with TalkTalk in 2010 to develop the technology, and that Huawei currently operates the filtering software. Huawei has denied the claim, saying that it doesn’t “run” TalkTalk’s HomeSafe, but rather, that the system is “supported by Huawei.” According to the BBC, however, U.K.-based Huawei employees can decide which sites HomeSafe blocks.
At present, it’s hard to know how the Cameron plan will work—subscribers will be given a default option—but just what that will look like still remains to be seen. Whatever the options are, the U.K.—and Cameron—is likely to see much more controversy on this topic.