
Saturday, February 5, 2011

Introduction to Computers

We begin our introduction to computers with a brief history of how they evolved. Although this course and the A+ exam focus on the modern electronic computer, many principles used in early computational machines still apply to their modern successors. With a summary of computer development and discussion of the role of today's computer professional, this chapter lays the foundation for the chapters that follow.


In this lesson, we take a brief look at the development of the computer. By understanding its origins, you'll gain an appreciation for both the complexity and simplicity of today's computers.

After this lesson, you will be able to
  • Describe the major milestones in the development of the modern computer
Estimated lesson time: 15 minutes

Many of us think only in terms of electronic computers, powered by electricity. (If you can't plug it in, is it a computer?) But as the definition in Funk & Wagnalls Standard College Dictionary makes clear, to "compute" is to "ascertain (an amount or number) by calculation or reckoning." In fact, the first computers were invented by the Chinese about 2500 years ago. They are called abacuses and are still used throughout Asia today.
The abacus, shown in Figure 1.1, is a calculator; its first recorded use was circa 500 B.C. The Chinese used it to add, subtract, multiply, and divide. However, the abacus was not unique to the continent of Asia; archeological excavations have revealed an Aztec abacus in use around 900 or 1000 A.D.
Figure 1.1 The abacus

The Analytical Engine (A Pre-Electronic Computer)
The first mechanical computer was the analytical engine, conceived and partially constructed by Charles Babbage in London, England, between 1822 and 1871. It was designed to receive instructions from punched cards, make calculations with the aid of a memory bank, and print out solutions to math problems. Although Babbage lavished the equivalent of $6,000 of his own money—and $17,000 of the British government's money—on this extraordinarily advanced machine, the precise work needed to engineer its thousands of moving parts was beyond the ability of the technology of the day to produce in large volume. It is doubtful whether Babbage's brilliant concept could have been realized using the available resources of his own century. If it had been, however, it seems likely that the analytical engine could have performed the same functions as many early electronic computers.
The first computer designed expressly for data processing was patented on January 8, 1889, by Dr. Herman Hollerith of New York. The prototype model of this electrically operated tabulator was built for the U.S. Census Bureau to compute results of the 1890 census.
Using punched cards containing information submitted by respondents to the census questionnaire, the Hollerith machine made instant tabulations from electrical impulses actuated by each hole. It then printed out the processed data on tape. Dr. Hollerith left the Census Bureau in 1896 to establish the Tabulating Machine Company to manufacture and sell his equipment. The company eventually became IBM, and the 80-column punched card used by the company, shown in Figure 1.2, is still known as the Hollerith card.
Figure 1.2 The Hollerith 80-column punched card
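The tabulating principle Hollerith used, in which each punched hole closes an electrical circuit that advances a counter, can be sketched in a few lines of Python. The card positions and category names below are hypothetical, chosen purely for illustration; they do not reflect the actual 1890 census card layout.

```python
# Hypothetical sketch of Hollerith-style tabulation: each card is a set of
# punched positions, and every hole advances the counter wired to it.
from collections import Counter

# Each "card" records which positions were punched (positions are made up).
cards = [
    {"male", "age_20_29", "farmer"},
    {"female", "age_30_39", "teacher"},
    {"male", "age_20_29", "teacher"},
]

tallies = Counter()
for card in cards:
    for hole in card:          # each hole fires one electrical impulse...
        tallies[hole] += 1     # ...which advances the matching counter

print(tallies["male"])         # 2
print(tallies["age_20_29"])    # 2
```

The point of the sketch is the mechanism, not the data: the machine never "reads" a card as a whole, it simply counts circuit closures, which is why tabulation could happen as fast as cards were fed.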
The first modern digital computer, the ABC (Atanasoff–Berry Computer), was built in a basement on the Iowa State University campus in Ames, Iowa, between 1939 and 1942. The development team was led by John Atanasoff, a professor of physics and mathematics, and Clifford Berry, a graduate student. This machine used concepts still in use today: binary arithmetic, parallel processing, regenerative memory, and the separation of memory from computing functions. When completed, it weighed 750 pounds and could store 3,000 bits (about 0.4 KB) of data.
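Two of the ABC's concepts can be illustrated with a little Python (a sketch of the ideas, not of the ABC's actual circuitry): binary arithmetic, and the conversion behind the quoted figure of 3,000 bits being roughly 0.4 KB.

```python
# Binary arithmetic: the ABC computed with base-2 numbers, as all modern
# computers do. Python can show the same value in binary and decimal.
a = 0b1011          # 11 in decimal
b = 0b0110          #  6 in decimal
total = a + b
print(bin(total), total)     # 0b10001 17

# Capacity: 3,000 bits -> bytes -> kilobytes (8 bits per byte, 1024 bytes per KB)
bits = 3000
kilobytes = bits / 8 / 1024
print(round(kilobytes, 2))   # 0.37, i.e. roughly the 0.4 KB quoted above
```

For scale, the 64 KB of RAM in the original 1981 IBM PC mentioned later in this lesson holds about 175 times the ABC's entire storage.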
The technology developed for the ABC machine was passed from Atanasoff to John W. Mauchly, who, together with engineer John Presper Eckert, developed the first large-scale digital computer, ENIAC (Electronic Numerical Integrator and Computer). It was built at the University of Pennsylvania's Moore School of Electrical Engineering. Begun as a classified military project, ENIAC was designed to prepare firing and bombing tables for the U.S. Army and Navy. When finally assembled in 1945, ENIAC consisted of 30 separate units, plus a power supply and forced-air cooling. It weighed 30 tons, and used 19,000 vacuum tubes, 1500 relays, and hundreds of thousands of resistors, capacitors, and inductors. It required 200 kilowatts of electrical power to operate.
Although programming ENIAC was a mammoth task requiring manual switches and cable connections, it became the workhorse for the solution of scientific problems from 1949 to 1952. ENIAC is considered the prototype for most of today's computers.
Another computer history milestone is the Colossus I, an early digital computer built at a secret British government research establishment at Bletchley Park, Buckinghamshire, England, under the direction of Professor Max Newman. Colossus I was designed for a single purpose: cryptanalysis, or code breaking. Using punched paper tape input, it scanned and analyzed 5000 characters per second. Colossus became operational in December 1943 and proved to be an important technological aid to the Allied victory in World War II. It enabled the British to read the otherwise impenetrable high-level German "Tunny" traffic, enciphered by the Lorenz machine. (The better-known Enigma ciphers were attacked with a different device, the electromechanical bombe.)
The 1960s and 1970s marked the golden era of the mainframe computer. Using the technology pioneered with ABC, ENIAC, and Colossus, large computers that served many users (with accompanying large-scale support) came to dominate the industry.
As these highlights show, the concept of the computer has indeed been with us for quite a while. The following table provides an overview of the evolution of modern computers—it is a timeline of important events.
1971
The 4004, the first 4-bit microprocessor, is introduced by Intel. It contains roughly 2,300 transistors and runs at clock speeds of up to 740 kilohertz (kHz).
1972
The first 8-bit microprocessor—the 8008—is released.
1974
The 8080 microprocessor is developed. This improved version of the 8008 becomes the standard from which future processors will be designed.
1975
Digital Research introduces CP/M—an operating system for the 8080. The combination of software and hardware becomes the basis for the standard computer.
1976
Zilog introduces the Z80—a low-cost microprocessor (equivalent to the 8080).
The Apple I is introduced, although it sees only limited use.
1977
The Apple II and the Commodore PET computers, both of which use a 6502 processor, are introduced. These two products become the basis for the home computer. Apple's popularity begins to grow.
1978
Intel introduces a 16-bit processor, the 8086, and a companion math coprocessor, the 8087.
Intel also introduces the 8088. It is similar to the 8086, but it transmits 8 bits at a time.
1980
Motorola introduces the 68000—a 16-bit processor important to the development of Apple and Atari computers. Motorola's 68000 becomes the processor of choice for Apple.
1981
The IBM personal computer (PC) is born; it contains a 4.77-MHz 8088 processor and 64 kilobytes (KB) of RAM (random access memory), and is equipped with a version of MS-DOS 1.0 (three files and some utilities).
Available mass-storage devices include a 5.25-inch floppy drive and a cassette tape drive.
1982
Intel completes development of the 80286—a 16-bit processor with 150,000 transistors.
MS-DOS 1.1 now supports double-sided floppy disks that hold 360 KB of data.
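The 360 KB figure for a double-sided 5.25-inch disk follows directly from the disk geometry that DOS settled on: 2 sides, 40 tracks per side, 9 sectors per track, 512 bytes per sector. A quick arithmetic check in Python:

```python
# Capacity of the classic double-sided, double-density 5.25-inch floppy format
sides, tracks, sectors_per_track, bytes_per_sector = 2, 40, 9, 512
total_bytes = sides * tracks * sectors_per_track * bytes_per_sector
print(total_bytes, total_bytes // 1024)   # 368640 bytes = 360 KB
```

The same formula explains the other capacities in this timeline; for example, the later 3.5-inch high-density format (2 sides, 80 tracks, 18 sectors) works out to 1440 KB, marketed as "1.44 MB."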
1983
IBM introduces the XT computer with a 10-MB hard disk drive.
MS-DOS 2.0 arrives; it features a hierarchical (tree-like) directory structure and native support for hard disk drive operations.
1984
The first computer with an 80286 chip—the IBM AT—enters the market.
It is a 6-MHz machine with a 20-MB hard disk drive and a high-density, 1.2-MB 5.25-inch floppy disk drive.
Apple introduces the Macintosh computer, marking the first widespread use of the graphical user interface and mouse.
1985
MS-DOS 3.2, which supports networks, is released.
1986
The first Intel 80386-based computer is introduced by Compaq; it features a 32-bit processor with expanded multitasking capability (even though no PC operating system yet fully supports the feature).
1987
MS-DOS 3.3 arrives, allowing use of 1.44-MB 3.5-inch floppy disk drives and hard disk drives larger than 32 MB.
1988
IBM introduces the PS/2 computer series. A departure from previous machines, its proprietary Micro Channel bus does not accept the expansion hardware used in IBM PCs or clones, although it still runs the same software.
Microsoft (with the help of IBM) develops OS/2 (Operating System/2), which offers protected-mode operation, genuine multitasking, and MS-DOS compatibility.
Microsoft releases MS-DOS 4.0.
1989
Intel introduces the 80486 processor; it contains an on-board math coprocessor and an internal cache controller (offering 2.5 times the performance of a 386 processor with a supporting coprocessor).
1991
MS-DOS 5.0 offers a significantly improved DOS shell.
1992
The Intel i586 processor, the first Pentium, is introduced, offering 2.5 times the performance of a 486.
Microsoft introduces Windows 3.1, vastly expanding the use of a graphical user interface in the mass market. IBM expands OS/2.
1993
MS-DOS 6.0 arrives. The term "multimedia" (the inclusion of CD-ROM drives, sound cards, speakers, and so forth, as standard equipment on new personal computers) comes into use.
1994
Intel delivers the first 100-MHz processor. Compaq Computer Corporation becomes the largest producer of computers.
1995
Windows 95, code-named Chicago, is introduced by Microsoft. It features 32-bit architecture.
The Internet, having expanded far beyond its beginnings as a network serving government and university institutions, is now in everyday use by the rapidly growing proportion of the population with access to a modem.
Computer prices drop as performance increases. IBM purchases Lotus (maker of the popular Lotus 1-2-3 spreadsheet).
1995-1996
Software manufacturers scramble to make their products compatible with Windows 95.
1997
Microprocessor speeds exceed the 200-MHz mark. Hard disk drive and memory prices fall as basic system configuration sizes continue to increase.
CD-ROM drives and Internet connections have become standard equipment for computers.
1998
PC performance continues to soar and prices continue to fall. Central processing unit (CPU) speeds exceed 450 MHz, and motherboard bus speeds reach 100 MHz.
Entry-level machines are priced near the $500 mark.
Universal serial bus (USB) is introduced.
Windows 98 becomes the standard operating system for most new personal computers. Computer prices drop well under $1,000, increasing computer sales to the home market.
1999
Processor speeds approach 1 gigahertz (GHz). E-commerce grows dramatically as the Internet expands.
2000
Microsoft releases Windows 2000 and the basic PC becomes a commodity item in discount stores. Broadband connections such as DSL and cable begin to take hold, making Internet access easier and faster than over the telephone line.
The following points summarize the main elements of this lesson:
  • The concepts that form the basis of computer technology have a long history that stretches back 2500 years.
  • Rudimentary, electrically powered computers were first developed in the late 1930s and 1940s.
  • The "standard" PC has undergone several stages of evolution, characterized by improvements to the processor, internal architecture, and types of storage devices.
