The fourth generation of computers began around 1975 and lasted until around 1985. It covers the period of computer history when the integrated circuit chip evolved into the microprocessor, a "computer on a chip." As a result, the first practical desktop computers came into being, starting with hobbyist do-it-yourself models such as the mail-order Altair 8800 kit and progressing to early commercial machines such as the Commodore PET and the Tandy TRS-80. The period also saw the successful introduction and mass production of the early IBM PC, its many clones, and the Apple Macintosh.
A star of the previous generation of computers had been the 1960s-era CDC 1604 from Control Data Corporation. To process data it relied on some 25,000 transistors and 100,000 diodes, along with thousands of resistors and capacitors, all individually wired together.
The microprocessor was en route to doing everything the CDC 1604 did on a single chip. It was born when researchers at Intel used photolithography to integrate the processing functions of arithmetic, logic, and control onto one piece of silicon.
The CPU read data and instructions that arrived as 8-bit bytes, performed arithmetic and logic calculations on them, and used its control functions to direct the results into streams of data that were ultimately written out, for example as graphics on a monitor.
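To make that cycle of reading, calculating, and controlling concrete, here is a minimal sketch of a fetch-decode-execute loop for a toy 8-bit machine, written in Python. The opcodes, the single accumulator register, and the tiny program are invented for illustration; they are not the Intel 8080's real instruction encoding.

    # Minimal fetch-decode-execute sketch for a toy 8-bit machine.
    # Opcodes, registers, and the program below are invented for illustration;
    # this is not the Intel 8080's actual instruction set.

    memory = [0x01, 0x05,   # LOAD  5 into the accumulator
              0x02, 0x03,   # ADD   3 to the accumulator
              0x03, 0x10,   # STORE accumulator at address 0x10
              0x00]         # HALT
    memory += [0] * 32      # a little scratch RAM

    acc = 0                 # accumulator register
    pc = 0                  # program counter

    while True:
        opcode = memory[pc]               # fetch one 8-bit instruction byte
        if opcode == 0x00:                # HALT: the control unit stops the loop
            break
        operand = memory[pc + 1]          # fetch the 8-bit operand byte
        if opcode == 0x01:                # LOAD immediate
            acc = operand
        elif opcode == 0x02:              # ADD immediate (arithmetic/logic step)
            acc = (acc + operand) & 0xFF  # keep the result within 8 bits
        elif opcode == 0x03:              # STORE accumulator to memory
            memory[operand] = acc
        pc += 2                           # control unit advances to the next instruction

    print(acc, memory[0x10])              # both print 8 (5 + 3)

The same pattern of fetching a byte, deciding what it means, and acting on it is what the single-chip CPU carried out millions of times per second.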
The integrated microprocessor chip became known as the central processing unit -- the CPU -- the "brains" of the computer. Its arrival built on the 1958-1959 invention of the integrated circuit chip by Jack Kilby at Texas Instruments and Robert Noyce, then at Fairchild Semiconductor. Working independently, the two engineers had miniaturized the transistor and created the IC chip as a solid-state piece of silicon (or, in Kilby's case, germanium). Their discoveries ushered in the new age of solid-state electronics.
Kilby later received the Nobel Prize for the IC chip, while Noyce continued its development as a co-founder of the Intel Corporation. Meanwhile, the solid-state miniaturization of electronic components quickly pushed technology forward in space, defense, and consumer applications. By the 1970s, large-scale integration (LSI) was putting tens of thousands of transistors on one chip, and it would eventually give way to very-large-scale integration (VLSI), with millions and then, after the turn of the century, billions of transistors per chip.
Under Noyce, Intel released the first commercial single-chip microprocessor, the 4004, on November 15, 1971. The company had also developed early random-access memory (RAM) chips to provide temporary storage for the CPU. The 4004 could process about 60,000 instructions per second. It was not until Intel produced the 8-bit 8080 microprocessor in April 1974 that the desktop revolution really began to bloom.
The 8080 packed some 6,000 transistors, miniaturized by photolithography, onto a single microprocessor chip. It ran at a clock speed of 2 MHz and could process several hundred thousand instructions per second.
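To see why a 2 MHz clock yields only a few hundred thousand instructions per second, divide the clock rate by the number of clock cycles each instruction consumes. The quick calculation below is a rough sketch: the seven-cycle average is an assumed figure for illustration, since real 8080 instructions took roughly 4 to 18 cycles each.

    # Rough throughput estimate for a 2 MHz 8080-class CPU.
    # The 7-cycle average is an assumption for illustration; actual 8080
    # instructions took roughly 4 to 18 clock cycles each.
    clock_hz = 2_000_000
    avg_cycles_per_instruction = 7
    print(clock_hz / avg_cycles_per_instruction)  # ~286,000 instructions per second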
Soon, hobbyists were ordering the MITS Altair 8800, a bare-bones computer built around the 8080 microprocessor, after it appeared on the January 1975 cover of Popular Electronics. A BASIC language interpreter for programming the machine was written by Bill Gates and Paul Allen.
In 1976, Steve Wozniak and Steve Jobs founded Apple Computer, Inc. and began experimenting with their first computer models, which used the MOS Technology 6502 microprocessor as the CPU. The two founders began mass-producing their Apple II microcomputer in 1977.
Xerox Corporation was an important early experimenter in desktop technologies. By the mid-1970s, its Palo Alto Research Center (PARC) had built a desktop-sized minicomputer system called the Alto and had done extensive research into graphics. The early desktop models of the day relied on command-line controls, where the user typed a line of instructions at a command prompt. Early desktop makers, including Steve Jobs, visited Xerox PARC and came away with ideas about graphical user interfaces and the mouse.
Other fourth-generation milestones include the advent of the IBM PC, with an operating system from Microsoft, and the 1984 introduction of the Apple Macintosh. IBM released the first version of the IBM PC in August 1981; it shipped with Microsoft's MS-DOS (branded by IBM as PC DOS) as its operating system. The next major model was the IBM PC/AT, released in August 1984 and based on Intel's 16-bit 80286 CPU, a chip with 134,000 transistors that could reach clock speeds of 8 MHz. Many clones of the IBM PC models were produced, most notably by Compaq.
With distinctive flair, Apple announced the Macintosh desktop computer in a commercial aired during Super Bowl XVIII on January 22, 1984. In what became an Apple trademark, the Macintosh was known for its engaging graphics capabilities. The system, built around Motorola's 68000 CPU (a 16/32-bit design), was proprietary and could not be cloned.