Internet Censorship
I work at Infowest, Utah's second-largest Internet service provider, where I am a server administrator and webmaster. I often handle customer service questions, especially when our technicians are tied up on other calls, and I have received many calls like this one:
Caller: Why can't I access the web site www.cnn.com? My browser says it is blocked!
LM: Do you subscribe to our XStop service?
Caller: Yes, I do.
LM: Let me check the site. (I check the CNN web site for any reason why our censorship service would be blocking it… I find nothing questionable on the site.) The XStop service must be malfunctioning right now. I'll call them and tell them about the error and get back to you. Can I get your number?
I'm amazed how many times I get calls like this, caused by Internet censorship failures, usually when the service censors something it shouldn't. But, of course, I shouldn't be surprised: there are many technical problems with Internet censorship. First, the manpower and money needed to keep an accurate, up-to-date database of questionable sites, as most censorship firms try to do, are tremendous. Second, there is no accurate content-filtering software, especially on the client side. Also, the Internet is multinational and uses many protocols, the languages computers use to talk to each other and negotiate data exchanges, making it nearly impossible to police the entirety of the network. Lastly, there is data encryption, which makes censorship of encrypted data impossible.
The way the XStop service works is supposed to solve one of the major problems with Internet censorship software: the lack of manpower to keep up the database of questionable sites. XStop allows any user who finds a questionable site that isn't in the database to add it, with little or no review of the submission by anyone at XStop. This scheme saves money for XStop, the subscribing ISP, and the ISP's customers, and it looks like a good solution to the manpower problem: the service distributes the responsibility of keeping the database up to date to everyone on the Internet, so it has the potential of several million people keeping the database current and accurate. Right? Wrong. There are many pranksters out there who add sites without questionable content to the database, blocking access to those sites for anyone subscribed to the service. Many censorship proponents would say this isn't much of a problem: ISPs that use XStop, or services like it, can simply report errors and have them fixed. That is easy for them to say; they don't have to deal with irate customers.
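The submission scheme described above can be sketched in a few lines of Python. This is a hypothetical illustration, not XStop's actual code; the class and site names are my own inventions. The point is that an unreviewed submission pipeline treats a prankster's report exactly like a legitimate one:

```python
# A minimal sketch (hypothetical, not XStop's real implementation) of a
# crowd-maintained blocklist with no review step.

class CrowdBlocklist:
    """Blocklist that trusts every user submission immediately."""

    def __init__(self):
        self.blocked = set()

    def submit(self, url):
        # No human review: any submission goes straight into the database.
        self.blocked.add(url)

    def is_blocked(self, url):
        return url in self.blocked

filter_db = CrowdBlocklist()
filter_db.submit("adult-site.example")  # a legitimate report
filter_db.submit("www.cnn.com")         # a prankster's report

# Both are now blocked for every subscriber -- the failure mode behind
# the support call quoted in the introduction.
print(filter_db.is_blocked("www.cnn.com"))  # True
```

Any review step that would catch the prank is exactly the manpower cost the scheme was designed to avoid.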
The truth is that most ISPs spend more money using and helping maintain services like XStop than they make by offering them. The problem isn't limited to services like XStop; other companies provide censorship software whose databases are not publicly maintained, yet the same problems occur. The people maintaining those databases can't cover the entirety of the Internet in their searches for questionable material. The Internet is so large that not even automated search engines such as Yahoo!, Excite, or Google have all of its pages indexed (Google has the largest index, at over a billion pages). Keeping an accurate, up-to-date database of questionable sites costs more time and money than any one entity can afford.
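A back-of-envelope calculation makes the scale concrete. The billion-page figure comes from the text above; the staff size and review rate are assumptions I chose for illustration, and generous ones at that:

```python
# Back-of-envelope arithmetic (illustrative assumptions, not measurements)
# for how long a review team would need just to vet the indexed web once.

indexed_pages = 1_000_000_000        # Google's index size, per the text
reviewers = 100                      # assumed staff size
pages_per_reviewer_per_day = 1_000   # assumed (very generous) review rate

days = indexed_pages / (reviewers * pages_per_reviewer_per_day)
print(f"{days:,.0f} days, roughly {days / 365:.0f} years")
# 10,000 days, roughly 27 years -- and the web keeps growing meanwhile
```

Even under these optimistic assumptions, a single pass takes decades, during which the content of every reviewed page may have changed.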
The database upkeep problem isn't the only show-stopper when it comes to Internet censorship. There is also the problem of filtering anything other than the web. Other Internet services are perfect places to distribute pornographic or otherwise questionable content: FTP, Hotline, Usenet, and Gnutella variants are among the most popular. Because these services are password protected or use proprietary protocols, filtering them requires filtering at the receiving end, on the client side: the content must be downloaded first, then analyzed. The only reliable way to automate that analysis, especially of graphics files, is some kind of artificial intelligence that can judge whether content is appropriate. No current artificial intelligence technology can do this without making many errors and, therefore, blocking potentially worthwhile content.
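Even for plain text, the simplest form of automated analysis shows the error problem. The sketch below is a toy keyword filter of my own devising, far cruder than commercial products, but it exhibits both failure modes at once: it misses content it should catch and blocks content it shouldn't:

```python
# A toy client-side text filter (an assumed keyword approach, not any
# real product's algorithm) showing why naive analysis over- and
# under-blocks at the same time.

BANNED_WORDS = {"breast", "sex"}

def is_questionable(text):
    # Crude substring matching -- the source of the false positives.
    lowered = text.lower()
    return any(word in lowered for word in BANNED_WORDS)

print(is_questionable("Hot XXX pics here!"))                  # False: missed entirely
print(is_questionable("Breast cancer screening guidelines"))  # True: blocks health info
print(is_questionable("Sussex county election results"))      # True: "sex" inside "Sussex"
```

A filter for images has an even harder job: there are no keywords at all, only pixels, which is why reliable automated judgment would require artificial intelligence that does not yet exist.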
In addition to the problems of manpower and client-side filtering, there is the sheer size and complexity of the Internet. Many people in America believe the Internet is owned and operated by the U.S. government. While the precursor to the Internet, ARPANET, was developed by the U.S. Department of Defense, the Internet itself has grown far beyond that original seed. It is now a network of millions of computers, clients and servers, in every country on the globe, and the rules and regulations of the United States cannot regulate the entirety of the network. Content considered illegal in America can be freely posted on servers elsewhere; England, for example, regulates what children may see differently than the United States does. Yet people in America can access any server on the network as easily as a server down the hall. The borderless nature of the Internet makes it impossible for any one country to ban the posting of pornography where children can reach it; every country in the world would have to pass the same law for it to be truly effective.
The multi-protocol nature of the Internet adds to this complexity. New protocols for content transfer are developed every day, rendering software that censors only today's protocols obsolete in short order. To complicate matters, many of these protocols are proprietary, so any company wanting to censor the data they carry would have to license the protocol from its creator (potentially very expensive) or try to reverse engineer it and risk a lawsuit.
The biggest stumbling block now appearing is encryption. Encrypted data cannot be censored, period. Strong public-key encryption with 2048-bit keys, now in common use, is effectively unbreakable: many cryptographers estimate that all the world's supercomputers, working together, would need trillions of years to brute-force a single 2048-bit key. No censorship software could break that kind of encryption to censor the data inside.
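A toy demonstration makes the point. The construction below is a throwaway XOR stream cipher built from a hash function, chosen only because Python's standard library includes no 2048-bit public-key cipher; it is emphatically not secure, but it is enough to show what a filter sitting on the wire actually sees:

```python
# Toy illustration (stdlib-only; NOT a secure cipher and not what real
# systems use -- real traffic would use 2048-bit RSA or similar) of why
# a content filter cannot inspect encrypted data.

import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive an endless pseudorandom byte stream from the shared key.
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

message = b"this page contains pornography"
ciphertext = xor_crypt(message, b"shared-secret")

# The filter on the wire sees only the ciphertext:
print(b"pornography" in ciphertext)                        # False: nothing to match on
print(xor_crypt(ciphertext, b"shared-secret") == message)  # True: recipient reads it fine
```

The filter would have to hold the key to inspect the traffic, and the whole point of encryption is that only the two endpoints have it.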
Censoring the Internet is a controversial subject. Many want it censored and many do not, but from a technical standpoint it is impossible. The encryption issue is a stake through the heart of Internet censorship: how can you know whether something needs to be censored if you can't read it? Even setting encryption aside, the Internet's multiple protocols, the multiple legal systems governing it, and the problems of manpower and client-side filtering make it impossible to censor the Internet acceptably, without also blocking content that may be worthwhile. The bickering may as well stop; censorship cannot truly happen on the Internet.