Cybersense—Truth About Content Screening Isn’t Filtering Through


Politicians know that if enough people want something to be true, it might as well be.

Take the death penalty. Though studies have found no evidence it deters crime, so many people want to believe otherwise that both Al Gore and George W. Bush can confidently claim that it does.

The same dynamic applies to filtering software, which is designed to block access to pornography, hate speech and other inappropriate content on the Internet. People are so enamored with the idea behind these programs that some members of Congress want to force filters into every school and library that receives federally subsidized Internet equipment.

But technology isn’t as pliable as politics. The simple binary truth about filters is that they don’t work. Indeed, the latest proof comes from a report by the very commission that Congress asked to come up with a way to protect children online.

The Child Online Protection Act Commission was created in 1998 as part of an anti-pornography bill, most of which was blocked by the courts. The group was supposed to seek out the best possible way to keep children away from pornography and other inappropriate content.

Parental responsibility

Filters were among many approaches studied by the group, which included anti-pornography activists and a cyber-libertarian among a roster of Internet industry executives and government officials. The group also considered ratings systems, age-verification schemes and proposals to divide the Internet into zones that are either appropriate for kids or off-limits to them.

Ultimately, the commission chose none of the above. In its Oct. 20 report to Congress (available at www.copacommission.org/report), the group said the federal government should educate parents about the issue, ask pornographers to impose voluntary restrictions, and enforce existing obscenity laws.

“The commission concludes that new laws would not only be constitutionally dubious, they would not effectively limit children’s access to inappropriate materials,” wrote commission member Jerry Berman of the Center for Democracy and Technology.

The commission found software filters were somewhat effective in blocking inappropriate content. On a scale of one to 10, members awarded various filters grades ranging from 5.4 to 7.4, a gentleman’s C at best.

No filter blocks every objectionable site, the commission found. Moreover, they often censor sites that don’t contain any inappropriate content, creating significant First Amendment concerns for libraries and schools that use them.

Part of the problem is that the Web changes far too quickly for any one company to keep up with. And in their haste to review as many sites as possible, the people categorizing these sites make plenty of mistakes.

Peacefire (www.peacefire.org), a nonprofit filtering watchdog group, found that nearly four of five sites classified by one popular filter as pornographic were actually innocuous. Other filtering programs are nearly as bad, and some display a political bias by blocking gay rights pages and other sites objectionable to conservatives.

To make things worse, most makers of filtering software refuse to release lists of the sites their programs block. This makes it impossible for users, including teachers and librarians, to determine if their software matches their own standards.

Malfunctioning products

While parents are free to trust some software company’s definitions of obscenity and hate speech, the Supreme Court obliges local government agencies to make such judgments for themselves, according to the standards of their own communities.

Had the commission found in favor of filters, Republican leaders in Congress would be crowing about it. Instead, they have ignored their own group’s findings and pressed on with efforts to force these malfunctioning products into every school or library that accepts federal “e-rate” subsidies.

Why? Because even though filters are bad software, they make good politics.

An overwhelming majority of Americans, 92 percent to be precise, believe schools should use software filters to block pornography, according to a recent survey conducted by a high-tech industry consortium called the Digital Media Forum. Seventy-two percent of the 1,900 people surveyed said filters should be used to block hate speech.

So, it doesn’t really matter that filters can’t do either of those things effectively. Since so many people want to believe they do, it makes good political sense for members of Congress to play along.

After all, if filters are incapable of screening out offensive content, they haven’t a prayer of blocking political pandering.

To contact syndicated columnist Joe Salkowski, you can e-mail him at [email protected] or write to him c/o Tribune Media Services Inc., 435 N. Michigan Ave., Suite 1400, Chicago, IL, 60611.
