
Computers: History and Development

Fourth Generation (1971-Present)

After the integrated circuit, the only place to go was down – in size, that is. Large-scale integration (LSI) could fit hundreds of components onto one chip. By the 1980s, very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a chip, and ultra-large-scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers while increasing their power, efficiency, and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a single minuscule chip. Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets, and automobiles with electronic fuel injection incorporated microprocessors.

Such condensed circuitry put a computer's power within reach of everyday people. Computers were no longer developed exclusively for large business or government contracts. By the mid-1970s, computer manufacturers sought to bring computers to general consumers. These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack, and Apple Computer. In the early 1980s, arcade video games such as Pac-Man and home video game systems such as the Atari 2600 ignited consumer interest in more sophisticated, programmable home computers.

In 1981, IBM introduced its personal computer (PC) for use in the home, office, and schools. The 1980s saw an expansion of computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982; ten years later, 65 million PCs were in use. Computers continued their trend toward smaller sizes, working their way down from the desktop to laptop computers (which could fit inside a briefcase) and palmtops (able to fit inside a breast pocket). In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984. Notable for its user-friendly design, the Macintosh offered an operating system that let users manipulate on-screen icons instead of typing instructions. Users controlled the cursor with a mouse, a handheld device whose movement across the desk is mirrored by the cursor on the screen.

As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software, and information, and to communicate with one another. Whereas a mainframe was a single powerful computer that shared its time among many terminals running many applications, networked computers allowed individual machines to form electronic co-ops. Using either direct cabling, as in a local area network (LAN), or telephone lines, these networks could reach enormous proportions. The Internet, for example, is a global web of computer circuitry that links computers worldwide into a single network of information. During the 1992 U.S. presidential election, vice-presidential candidate Al Gore promised to make the development of this so-called "information superhighway" an administration priority. Though the possibilities envisioned by Gore and others for such a large network are often years (if not decades) away from realization, the most popular use today for computer networks such as the Internet is electronic mail, or e-mail, which lets users type in a computer address and send messages through networked terminals across the office or across the world.
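
As a rough illustration of the message passing that underlies such networks, here is a minimal sketch in modern Python (which far postdates the systems described); the host, port, and function names are hypothetical, and real e-mail rides on richer protocols such as SMTP.

```python
import socket

# Hypothetical address for the receiving machine; any reachable
# host and free port would do.
HOST, PORT = "localhost", 9090

def receive_one_message() -> str:
    """Listen for a single incoming message and return its text."""
    with socket.create_server((HOST, PORT)) as server:
        conn, _addr = server.accept()          # wait for a sender to connect
        with conn:
            return conn.recv(4096).decode("utf-8")

def send_message(text: str) -> None:
    """Connect to the receiver and deliver one message."""
    with socket.create_connection((HOST, PORT)) as conn:
        conn.sendall(text.encode("utf-8"))

# Run receive_one_message() in one process (or on one machine) and
# send_message("Hello across the network") in another.
```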

Fifth Generation (Present and Beyond)

Defining the fifth generation of computers is somewhat difficult because the field is still in its infancy. The most famous example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth-generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most of the humans on board.)

Though the wayward HAL 9000 may be far from the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers can accept spoken-word instructions (voice recognition) and imitate human reasoning. The ability to translate a foreign language is also moderately possible with fifth-generation computers. This feat seemed a simple objective at first but proved much more difficult once programmers realized that human understanding relies as much on context and meaning as on the simple translation of words.

Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. Two such engineering advances are parallel processing, which replaces von Neumann's single central-processing-unit design with a system that harnesses many CPUs working as one, and superconductor technology, which allows electricity to flow with little or no resistance, greatly improving the speed of information flow.
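
To make the parallel-processing idea concrete, here is a minimal sketch in modern Python (chosen only for illustration): one large task is split into pieces, each piece is handed to its own CPU, and the partial results are combined.

```python
from multiprocessing import Pool

def partial_sum(chunk: range) -> int:
    """The work given to one CPU: sum its share of the numbers."""
    return sum(chunk)

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    # Split one big task into equal pieces, one per CPU.
    chunks = [range(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine partial results
    print(total == sum(range(n)))  # True: same answer, computed in parallel
```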

Computers today have some attributes of fifth-generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development, however, before expert systems are in widespread use.
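
As an illustration of the rule-based reasoning behind such systems, here is a minimal sketch; the symptoms, rules, and conclusions are invented for the example, whereas a real medical expert system encodes knowledge elicited from physicians.

```python
# Invented symptom-to-conclusion rules, for illustration only.
RULES = [
    ({"fever", "cough"}, "possible influenza"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"fever", "stiff neck"}, "refer to a specialist"),
]

def diagnose(symptoms: set[str]) -> list[str]:
    """Fire every rule whose conditions are all present, mimicking the
    step-by-step checks a doctor might apply."""
    return [conclusion for conditions, conclusion in RULES
            if conditions <= symptoms]

print(diagnose({"fever", "cough", "fatigue"}))  # ['possible influenza']
```

Real systems chain hundreds of such rules and attach confidence levels to their conclusions, which is part of why widespread deployment is still years away.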
