Australia’s Labour government recently announced that they would be implementing a two-tiered, national content-filtering scheme for all Internet traffic. The proposal as it stands is that people will have a choice of Internet connections: The first will block all Internet content considered unsafe for children. The second will allow adult content, but block anything deemed illegal under Australian law. People can choose one or the other, but they must choose one.
As with all public content-filtering schemes, this idea is well-intentioned, but fatally flawed.
First, national content filtering is an inefficient and fundamentally faulty technical approach that deputises the nation’s Internet Service Providers into the role of neighbourhood sheriff, a role they’re not at all comfortable with. Second, and more importantly, it creates a dangerous legal and moral precedent that is difficult to distinguish from the infamous Great Firewall of China, which is regularly used to stifle social and political dissent.
Indeed, a spokesman for the online rights group Electronic Frontiers Australia recently said, “I’m not exaggerating when I say that this model involves more technical interference in the internet infrastructure than what is attempted in Iran, one of the most repressive and regressive censorship regimes in the world.”
The idea of content filtering is, on the face of it, appealing: one should be able to block objectionable material in order to make the Web ‘safe’ for children.
In principle, that’s a commendable thing. Indeed, it’s the responsible thing to do in certain circumstances. The Vanuatu IT Users Society strongly supports the Ministry of Education’s decision to use content filtering software in its schools. Adults have a duty to take reasonable steps to ensure that their children aren’t endangered in any way, and content filtering is a useful tool in this regard.
But technology alone is not sufficient to protect our children. The number of websites producing content unsuitable for children is immense, and content filters simply can’t block all of them, even if they’re updated every day.
And content filtering comes at a price. One important shortcoming of this technology is that it often blocks perfectly legitimate material as well. Over-sensitive filters applied over-zealously often have absurd results. In one famous case, an online service replaced the letters ‘a-s-s’ with ‘butt’ and ‘t-i-t’ with ‘breast’ every time they appeared on their site. The result looked something like this:
“We have buttiduously canvbutted the industry, buttessed what is available and buttembled the finest selection of contractors for this buttignment. The filters will buttociatively clbuttify all communications and filter them, I can butture you, rebuttemble them with surpbutting exacbreastude in any quanbreasty.”
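The absurdity above comes from substituting banned letter sequences wherever they occur, with no regard for word boundaries. A minimal sketch of such a naive filter (the word lists and function name here are illustrative, not taken from any real product) shows how easily it mangles innocent text:

```python
def naive_filter(text: str) -> str:
    """Blindly replace 'offensive' substrings anywhere they appear,
    ignoring word boundaries -- the mistake behind the mangled text above."""
    replacements = {"ass": "butt", "tit": "breast"}
    for bad, safe in replacements.items():
        text = text.replace(bad, safe)
    return text

print(naive_filter("classify"))    # -> clbuttify
print(naive_filter("exactitude"))  # -> exacbreastude
```

A filter that matched only whole words (for instance, with a word-boundary regular expression) would avoid this particular blunder, but the deeper lesson stands: mechanical rules applied without judgement produce absurd results.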
Jokes aside, content filters have their place, but it is not on public networks. In every pluralistic society, there will be a huge variance about what people consider acceptable and what they do not. To be sure, there are some things that all of us decry, but there are many more that we could never agree on.
Attempting to find a single set of rules to apply to an entire nation is a fool’s errand. It’s certain to create acrimony and accusations of censorship.
Whether intentional or not, there would be censorship, too. Content-filtering systems typically misidentify between 2% and 10% of all content. In strictly numerical terms, this means that millions of websites would be falsely blocked.
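The arithmetic is straightforward. Taking the misidentification rates above and a hypothetical figure of 150 million websites (an assumption for illustration only, not a statistic from any study), the number of wrongly blocked sites is easy to compute:

```python
def falsely_blocked(total_sites: int, error_rate: float) -> int:
    """Sites a filter would wrongly block at a given misidentification rate."""
    return int(total_sites * error_rate)

# Hypothetical 150 million websites, at the 2% and 10% error rates:
print(falsely_blocked(150_000_000, 0.02))  # -> 3000000
print(falsely_blocked(150_000_000, 0.10))  # -> 15000000
```

Even at the optimistic end of the range, millions of legitimate sites would disappear behind the filter.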
Studies conducted while assessing Australia’s national content-filtering scheme found that various candidate services slowed traffic down 18 to 78%. An individual school or household might be able to justify this kind of slowdown, but to enforce it on an entire population would unfairly jeopardise online business activity and make everyone’s surfing experience feel like walking waist-deep in treacle.
Australian Senator Stephen Conroy angrily denounced concerns raised by opponents of the plan, saying “If people equate freedom of speech with watching child pornography, then the Rudd Labor Government is going to disagree.”
That’s a disingenuous argument at best. And it’s more than a little worrying that the very minister with the power to limit what citizens are allowed to see is so contemptuous of reasoned criticism. With a national content-filtering scheme in place, he would have the ability – and, he might say, the mandate – to block such dissenting views entirely.
The mere principle of the thing is troubling. In order to block illegal content, you have to see what every bit of traffic is doing. In effect, it’s like stopping every driver on every road, every day, just in case one of them is drunk.
Happily, such a thing can’t happen here.
Vanuatu’s constitution explicitly constrains the government from conducting illegal search and seizure. It cannot interfere in the lives of its citizens without cause. Creating a national content-filtering system here would likely be found illegal, because it amounts to government inspection of all its citizens’ private communications. The Supreme Court would never allow the government to listen to every telephone call, to open every letter or even to read every postcard sent. Content filtering on a national basis would be effectively the same thing.
In this age of technological innovation, it’s often difficult to fight the temptation to treat every challenge as a technical one. Information and communications technology has done much to simplify our lives, and has made some things possible that we only dreamed of before. It also creates any number of liabilities. It presents new threats to us on a regular basis.
Ultimately, we combat these threats as we always have. We look to ourselves and our community to protect our values, we try at every step to stay on the straightest road, and we use our own good judgement.
And the one thing that computers will never possess is judgement. They can never take the place of a strong moral compass, that sense of right and wrong that we learned in the arms of our family, our church, our community. Any attempt to replace this fundamental good sense with a tool loaded up with any number of arbitrary rules is bound to fail.
Freedom comes at a price. We know that some people here look at pornography. We know that criminals conspire over the phone. We know that people write objectionable things to one another. But that doesn’t give us the right to treat the entire nation like potential crooks.
The vast majority of people are law-abiding individuals, and it’s one of the tenets of a free society that we assume every person is innocent until they demonstrate otherwise.
The people who manage our national infrastructure face the constant temptation to peer and poke into our communications, sometimes with the best of intentions, sometimes not. But just because they can doesn’t mean they should. The desire to snoop is a temptation that must be resisted. I am sure that if it came out that one of our telephone companies were eavesdropping on our calls, there would be a national outcry. Listening in to a nation’s Internet communications is just as intrusive, and should be just as vehemently opposed.