Microprocessor


A microprocessor (µP) is a single integrated circuit chip that contains the central processing unit (CPU). A microprocessor is a multipurpose digital device that reads in digital data consisting of values and instructions, executes the instructions by interpreting them and performing the specified operations, and finally outputs a result. Today, microprocessors can be found in just about every digital device, from watches and TVs to phones and laptops. Worldwide annual production of microprocessors stands at over 200 billion units.[1][2]
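The read-interpret-execute-output cycle described above can be illustrated with a minimal sketch in C. The toy instruction set, encoding, and register names below are invented purely for illustration and do not correspond to any real microprocessor.

    /* Minimal sketch of a fetch-decode-execute loop for a hypothetical
     * accumulator machine; opcodes and encoding are invented for illustration. */
    #include <stdio.h>
    #include <stdint.h>

    enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_OUT = 3 };

    int main(void) {
        /* Program memory: each instruction is an opcode followed by one operand. */
        uint8_t program[] = {
            OP_LOAD, 5,   /* acc = 5       */
            OP_ADD,  7,   /* acc = acc + 7 */
            OP_OUT,  0,   /* output acc    */
            OP_HALT, 0
        };

        uint8_t acc = 0;   /* accumulator register */
        size_t  pc  = 0;   /* program counter      */

        for (;;) {
            uint8_t opcode  = program[pc];       /* fetch */
            uint8_t operand = program[pc + 1];
            pc += 2;

            switch (opcode) {                    /* decode and execute */
            case OP_LOAD: acc = operand;                   break;
            case OP_ADD:  acc += operand;                  break;
            case OP_OUT:  printf("%u\n", (unsigned)acc);   break;
            case OP_HALT: return 0;
            }
        }
    }

Running this sketch prints 12: the loop fetches each encoded instruction, interprets its opcode, performs the operation on the accumulator, and outputs the result, mirroring in miniature what a microprocessor does in hardware.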

Since the introduction of the first microprocessor, the Intel 4004, in 1971, microprocessor design has grown into a major discipline within the computer engineering industry. Many different design approaches have been developed over the years, resulting in many different instruction set architectures. With constant advances in fabrication capabilities, microprocessors have become steadily more complex, more powerful, and more compact.

History

The era of solid-state electronics began in the late 1940s with the introduction of the bipolar transistor. Within a decade, circuits combining multiple transistors and resistors on a single semiconductor substrate were being manufactured. By the late 1950s and early 1960s, large semiconductor fabrication companies such as Fairchild Semiconductor had been established.

During most of the 1960s, ICs contained a few dozen to a few hundred transistors. These small- and medium-scale integrated circuits were soldered together in various ways to form more complex logic. Early computers were built from hundreds to thousands of such discrete logic chips. These systems suffered from many problems, including high power consumption, heat dissipation issues, and latency.

With advances in fabrication technology, ever more transistors could be fabricated onto a single chip. During the late 1960s, the idea grew that all the logic necessary to make a functioning computer could be placed on a single chip. The first patent for a computer on a single integrated circuit was filed by Gilbert P. Hyatt on December 28, 1970 (Patent US4942516).

See also

References