
Internet Essay, Research Paper

The Internet has revolutionized the computer and communications world like nothing before. The invention of the telegraph, telephone, radio, and computer set the stage for this unprecedented integration of capabilities. The Internet is at once a world-wide broadcasting capability, a mechanism for information distribution, and a medium for collaboration and interaction between individuals and their computers without regard for geographic location.

The Internet represents one of the most successful examples of the benefits of sustained investment and commitment to research and development of information infrastructure. Beginning with the early research in packet switching, the government, industry, and academia have been partners in evolving and deploying this exciting new technology. Over its fifteen-year history, the Internet has functioned as a collaboration among cooperating parties. Certain key functions have been critical for its operation, not the least of which is the specification of the protocols by which the components of the system operate.

To get to the origins of the Internet, we have to go back in time to 1957. You probably have no cause to remember, but it was International Geophysical Year, a year dedicated to gathering information about the upper atmosphere during a period of intense solar activity. Eisenhower announced in 1955 that, as part of the activities, the USA hoped to launch a small Earth-orbiting satellite. Then the Kremlin announced that it hoped to do likewise. Planning in America focused on a sophisticated three-stage rocket, but in Russia they took a more direct approach: on 4 October 1957 the USSR launched Sputnik, a 70 kg bleeping sphere the size of a medicine ball, into Earth orbit. The effect in the United States was electrifying, since it seemed overnight to wipe out the feeling of invulnerability the country had enjoyed since the explosion of the first nuclear bomb twelve years before. One of the immediate reactions was the creation of the Advanced Research Projects Agency (ARPA) within the Department of Defense. Its mission was to apply state-of-the-art technology to US defense and to avoid being surprised (again!) by technological advances of the enemy. It was also given interim control of the US satellite program until the creation of NASA in October 1958.

ARPA became the technological think-tank of the American defense effort, directly employing a couple of hundred top scientists and with a budget sufficient for sub-contracting research to other top American institutions. Although advanced computing would come to dominate its work, the initial focus of ARPA’s activities was on space, ballistic missiles and nuclear test monitoring. Even so, from the start ARPA was interested in communication between its operational base and its sub-contractors, preferably through direct links between its various computers. In October 1972 ARPANET went ‘public’. At the First International Conference on Computers and Communication, held in Washington DC, ARPA scientists demonstrated the system in operation, linking together computers from 40 different locations. This stimulated further research in the scientific community throughout the Western world. Soon other networks would appear.

Here we have the first true computer network. Since it is all still fairly basic, it is worth noting that the underlying principles have remained essentially the same (even if they, mercifully, operate far faster and look much prettier). We start off with a passive terminal and an active host, a keyboard and a computer. They are linked together by a cable. By typing in commands recognized by the host, you can use the programs stored on it and access its files (and modify them and print them out as desired). Most people can envisage this arrangement within a single building, or complex of buildings.
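The division of labour between the passive terminal and the active host can be sketched in a few lines of code. The Python fragment below is purely illustrative, and the command names and file store in it are invented for the example: the terminal merely forwards typed commands, while the host holds the programs and files and does all the work.

```python
# A toy model of the passive-terminal / active-host arrangement.
# The file store and the command names are invented for illustration.

host_files = {
    "report.txt": "Quarterly figures...",
    "notes.txt": "Remember to book the machine room.",
}

def host_execute(command: str) -> str:
    """The active host interprets commands; the terminal only displays results."""
    parts = command.split(maxsplit=1)
    if parts[0] == "LIST":
        return "\n".join(sorted(host_files))
    if parts[0] == "PRINT" and len(parts) == 2:
        return host_files.get(parts[1], "NO SUCH FILE")
    return "UNRECOGNIZED COMMAND"

# The 'terminal': a keyboard and a screen, linked to the host by a cable.
for typed in ["LIST", "PRINT report.txt", "DELETE notes.txt"]:
    print(f"> {typed}")
    print(host_execute(typed))
```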

The original ARPANET grew into the Internet. The Internet was based on the idea that there would be multiple independent networks of rather arbitrary design, beginning with the ARPANET as the pioneering packet switching network, but soon to include packet satellite networks, ground-based packet radio networks and other networks. The Internet as we now know it embodies a key underlying technical idea, namely that of open architecture networking. In this approach, the choice of any individual network technology was not dictated by a particular network architecture but rather could be selected freely by a provider and made to interwork with the other networks through a meta-level “Internetworking Architecture”. Up until that time there was only one general method for federating networks. This was the traditional circuit switching method, where networks would interconnect at the circuit level, passing individual bits on a synchronous basis along a portion of an end-to-end circuit between a pair of end locations. Leonard Kleinrock had shown as early as 1961 that packet switching was a more efficient switching method. Along with packet switching, special purpose interconnection arrangements between networks were another possibility. While there were other limited ways to interconnect different networks, they required that one be used as a component of the other, rather than acting as a peer of the other in offering end-to-end service.
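To make the contrast with circuit switching concrete, the Python sketch below shows the essence of packet switching: a message is cut into small, individually addressed packets that can travel independently (and arrive out of order) and be reassembled at the destination, rather than monopolizing a dedicated end-to-end circuit. The field names, the packet size and the sample addresses are invented for the example.

```python
# A minimal illustration of packet switching: the message is split into
# independently addressed packets and reassembled from them at the far end.
# Field names, packet size and addresses are invented for the example.

PACKET_SIZE = 8  # bytes of payload per packet

def packetize(message: bytes, src: str, dst: str):
    """Cut a message into addressed, sequence-numbered packets."""
    packets = []
    for seq, start in enumerate(range(0, len(message), PACKET_SIZE)):
        packets.append({
            "src": src,
            "dst": dst,
            "seq": seq,
            "payload": message[start:start + PACKET_SIZE],
        })
    return packets

def reassemble(packets):
    """Packets may arrive out of order; sequence numbers restore the message."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["payload"] for p in ordered)

pkts = packetize(b"LOGIN REQUEST FROM UCLA TO SRI", src="UCLA", dst="SRI")
pkts.reverse()  # simulate out-of-order arrival
assert reassemble(pkts) == b"LOGIN REQUEST FROM UCLA TO SRI"
print(f"{len(pkts)} packets delivered and reassembled correctly")
```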

In order to access another computer, at a completely different facility, we have first to reach it. At the time this was usually done over a (high-speed) telephone line or lines. Once you arrive at the new ‘host’ you have to convince it to treat you in the same way as someone behind a terminal within its own system. Hence the need for an interface message processor (IMP), and for the same IMP to be installed at both ends! Now you can access its files. Of course, in order to preserve confidentiality, all computers differentiated between ‘open’ files and those that were password protected.
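Reaching a distant host today is done over a transport connection rather than a leased telephone line and an IMP, but the idea of being accepted as if you were a local terminal survives in remote-login protocols. The fragment below is only a sketch of that exchange using Python's standard socket library; the host name and port are placeholders, not a real service you should expect to answer.

```python
# A minimal sketch of reaching a remote host over the network.
# "example.org" and port 7 are placeholders, not a live service;
# the point is only the shape of the exchange.
import socket

def remote_session(host: str, port: int, request: bytes) -> bytes:
    """Open a connection to a distant host, send a request, return its reply."""
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(request)
        return conn.recv(4096)

# reply = remote_session("example.org", 7, b"HELLO\r\n")  # would need a live host
```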

So far, the net’s development was almost entirely ’science-led’. All this time, however, we must remember that parallel advances in computer capacities and speeds (not to mention the introduction of glass-fiber cables into communications networks) were enabling the system to expand. This expansion, in its turn, started to produce supply constraints, which stimulated further advances. By the early 1980s, when the Internet proper started operation, it was already beginning to face problems created by its own success. First, there were more computer ‘hosts’ linked to the net than had originally been envisaged (in 1984 the number of hosts topped 1,000 for the first time) and, second, the volume of traffic per host was much larger (mainly because of the phenomenal success of e-mail). Increasingly, predictions were voiced that the entire system would eventually grind to a halt.

The World Wide Web is a network of sites that can be searched and retrieved by a special protocol known as the Hypertext Transfer Protocol (HTTP). The protocol simplified the writing of addresses, automatically searched the Internet for the address indicated and automatically called up the document for viewing. The Web was designed in 1989 by Tim Berners-Lee and scientists at CERN (Geneva), the European center for high-energy physics, who were interested in making it easier to retrieve research documentation. A year later he had developed a ‘browser/editor’ program and had coined the name World Wide Web as a name for the program. The program was released free on an FTP site. This doesn’t sound very dramatic, but anyone used to the hassle of retrieving documents before then will testify that it represented a major leap forward. Once the entire dial-and-retrieve language had been simplified, the next step was to design an improved ‘browser’, a system which allowed the links to be hidden behind text (using the Hypertext Markup Language, HTML) and activated by a click with the ‘mouse’.
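At bottom, HTTP is a very plain text exchange: the browser opens a connection, asks for a document by name, and the server sends it back. The sketch below issues a minimal HTTP/1.0-style GET over a raw socket in Python to show how little the protocol asks of the client; example.com is used purely as a placeholder host.

```python
# A minimal HTTP GET written against a raw socket, to show the bare
# request/response exchange. example.com is a placeholder host.
import socket

def http_get(host: str, path: str = "/") -> bytes:
    """Request one document from a host and return the raw response bytes."""
    request = (
        f"GET {path} HTTP/1.0\r\n"
        f"Host: {host}\r\n"
        f"\r\n"
    ).encode("ascii")
    with socket.create_connection((host, 80), timeout=10) as conn:
        conn.sendall(request)
        chunks = []
        while True:
            data = conn.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

# print(http_get("example.com")[:200])  # status line and the first headers
```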

The Internet has become a test bed for the development of other protocols. Since there was no lower-level OSI infrastructure available, Marshall Rose proposed that the Internet could be used to try out X.400 and X.500. In RFC 1006, he proposed that we emulate TP0 on top of TCP, and so there was a conscious decision to help higher-level OSI protocols to be deployed in live environments before the lower-level protocols were available.
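The trick in RFC 1006 is modest: because TP0 expects a reliable, connection-oriented network service, TCP can stand in for it, provided each transport protocol data unit (TPDU) is wrapped in a small framing header so that record boundaries survive TCP's byte stream. The Python sketch below shows that framing as I read the RFC (a version byte of 3, a reserved byte, and a 16-bit length covering the whole packet); it is illustrative, not a complete TP0 implementation.

```python
# A sketch of RFC 1006-style framing: each TPDU is prefixed with a 4-byte
# header (version = 3, reserved = 0, 16-bit total length) so that TPDU
# boundaries are preserved on top of TCP's byte stream.
# This reflects my reading of the RFC and is not a full TP0 implementation.
import struct

TPKT_VERSION = 3

def frame_tpdu(tpdu: bytes) -> bytes:
    """Wrap one TPDU for transmission over a TCP connection."""
    total_length = len(tpdu) + 4  # the header counts toward the length field
    return struct.pack("!BBH", TPKT_VERSION, 0, total_length) + tpdu

def unframe(data: bytes) -> bytes:
    """Strip the framing header and recover the TPDU."""
    version, _reserved, total_length = struct.unpack("!BBH", data[:4])
    if version != TPKT_VERSION:
        raise ValueError("unexpected framing version")
    return data[4:total_length]

sample = b"\x02\xf0\x80"  # a tiny placeholder TPDU, invented for the example
assert unframe(frame_tpdu(sample)) == sample
```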

It seems likely that the Internet will continue to be the environment of choice for the deployment of new protocols and for the linking of diverse systems in the academic, government, and business sectors for the remainder of this decade and well into the next.
