
A brief history of the development of microprocessors

Are you reading this on a computer or a mobile device? Either way, the machine uses a microprocessor to do its work. The microprocessor is the heart of any device, from server to laptop. There are many brands of microprocessors from many different manufacturers, but they all do roughly the same thing in roughly the same way.
A microprocessor, also known as a processor or central processing unit (CPU), is a computing engine fabricated on a single chip. The first microprocessor was the Intel 4004, which appeared in 1971 and was not very powerful. All it could do was add and subtract, and only 4 bits at a time. What made it amazing was that everything fit on one chip. Why? Because before that, engineers built processors either from multiple chips or from discrete components (transistors in individual packages).

If you have ever wondered what the microprocessor in your computer does, what it looks like, or how it differs from other types of microprocessors, read on for the details.

Microprocessor Progress: Intel

The first microprocessor to become the heart of a simple home computer was the Intel 8080, a complete 8-bit computer on a single chip introduced in 1974, and it caused a real surge in the market. Later, in 1979, a new model was released: the Intel 8088. If you are familiar with the PC market and its history, you know that it moved from the Intel 8088 to the 80286, then to the 80386 and 80486, and then to the Pentium, Pentium II, Pentium III and Pentium 4. All of these microprocessors are made by Intel, and all are enhancements of the basic design of the 8088. The Pentium 4 can execute any piece of code that ran on the original 8088, but it does so about 5,000 times faster.

In 2004, Intel introduced microprocessors with multiple cores and millions of transistors, but even these microprocessors follow the same general rules as earlier chips. Additional information is summarized in the table:

  • Date: the year the processor was first introduced. Many processors were re-released at higher clock speeds for many years after the original release date.
  • Transistors: the number of transistors on the chip. You can see that the transistor count per chip has risen steadily over the years.
  • Micron: the width, in microns, of the smallest wire on the chip. For comparison, a human hair is about 100 microns thick. As feature sizes shrank, the number of transistors rose.
  • Clock frequency: the maximum rate at which the chip can be clocked. More on clock frequency a little later.
  • Data (bus) width: the width of the ALU (arithmetic logic unit). An 8-bit ALU can add, subtract, multiply, and so on, on 8-bit numbers. In many cases the external data bus is the same width as the ALU, but not always: the Intel 8088 had a 16-bit ALU and an 8-bit bus, while modern Pentiums are 64-bit.
  • MIPS: millions of instructions per second, a rough unit of measure for microprocessor performance. Modern processors do so many different things that MIPS ratings like those in the table have become meaningless, but they give a feel for the relative power of the microprocessors of those days.
From this table you can see that, in general, there is a relationship between clock speed and MIPS (millions of operations per second). The maximum clock speed is a function of the manufacturing process. There is also a relationship between the number of transistors and MIPS. For example, the Intel 8088, clocked at 5 MHz (versus 2.5-3 GHz today), executed only 0.33 MIPS (about one instruction per 15 clock cycles). Modern processors can often execute two instructions per clock cycle. That improvement is directly related to the number of transistors on the chip, as discussed below.

What is a chip?


A chip is also called an integrated circuit. Usually it is a small, thin piece of silicon onto which the transistors making up the microprocessor have been etched. A chip may be as small as an inch on a side, yet contain tens of millions of transistors. Simpler processors may consist of a few thousand transistors etched onto a chip just a few square millimeters in size.

How it works



Intel Pentium 4

To understand how a microprocessor works, it would be helpful to look inside and learn about its internals. In the process, you can also learn about assembly language, the native language of the microprocessor, and a lot of what engineers can do to increase the speed of the processor.

The microprocessor executes a collection of machine instructions that tell the processor what to do. Based on the instructions, the microprocessor does three main things:

  • Using its ALU (arithmetic logic unit), a microprocessor can perform mathematical operations such as addition, subtraction, multiplication and division. Modern microprocessors are capable of extremely complex operations.
  • A microprocessor can move data from one memory location to another.
  • A microprocessor can make decisions and jump to a new set of instructions based on those decisions.


A microprocessor can do very sophisticated things, but the three activities above are its core. The following diagram shows an extremely simple microprocessor capable of doing those three things. This microprocessor has:

  • An address bus (8, 16 or 32 bits wide) that sends an address to memory
  • A data bus (8, 16 or 32 bits wide) that sends data to memory or receives data from memory
  • RD (read) and WR (write) lines that tell memory whether it should set or get the addressed location
  • A clock line that lets a clock pulse sequence the processor
  • A reset line that resets the program counter to zero and restarts execution

Microprocessor memory

Earlier we talked about the address and data buses, and about the RD and WR lines. These connect either to RAM (random-access memory) or to ROM (read-only memory), and as a rule to both. In our sample microprocessor we have an 8-bit-wide address bus and an equally wide, 8-bit, data bus. That means the microprocessor can address 2^8 = 256 bytes of memory, and can read or write 8 bits of memory at a time. Let's assume this simple microprocessor has 128 bytes of ROM starting at address 0 and 128 bytes of RAM starting at address 128.

ROM stands for read-only memory. A ROM chip is programmed with a permanent, preset collection of bytes. The address bus tells the ROM chip which byte to fetch and place on the data bus. When the RD line changes state, the ROM chip presents the selected byte onto the data bus.

RAM stands for random-access memory. RAM contains bytes of information that the microprocessor can read or write, depending on whether the RD or WR line is signaled. One problem with today's RAM chips is that they forget everything once the power goes off. That is why a computer also needs ROM.



RAM chip or read-only memory (ROM) chip

By the way, nearly all computers contain some amount of ROM. On a personal computer, the ROM is called the BIOS (Basic Input/Output System). At startup, the microprocessor begins executing the instructions it finds in the BIOS. The BIOS instructions do their part: they check the hardware, and then they go to the hard disk to fetch the boot sector. The boot sector is one small program, and the BIOS stores it in RAM after reading it from the disk. The microprocessor then begins executing the boot sector's instructions from RAM. The boot sector program tells the microprocessor what else to load from the hard disk into RAM, executes that, and so on. This is how the microprocessor loads and runs the entire operating system.

Microprocessor instructions

Even the incredibly simple microprocessor just described has a fairly large set of instructions that it can execute. The set of instructions is implemented as bit patterns, each of which has a different meaning when loaded into the instruction register. People are not particularly good at remembering bit patterns, so a set of short words is defined to represent them. This set of short words is called the processor's assembly language. An assembler can translate the words into their bit patterns very easily, and the assembler's output is then placed in memory for the microprocessor to execute.

Here is a set of assembly language instructions:

  • LOADA mem - load register A from memory address
  • LOADB mem - load register B from memory address
  • CONB con - load a constant value into register B
  • SAVEB mem - save register B to memory address
  • SAVEC mem - save register C to memory address
  • ADD - add A and B and store the result in C
  • SUB - subtract B from A and store the result in C
  • MUL - multiply A and B and store the result in C
  • DIV - divide A by B and store the result in C
  • COM - compare A and B and store the result in test
  • JUMP addr - jump to an address
  • JEQ addr - jump, if equal, to address
  • JNEQ addr - jump, if not equal, to address
  • JG addr - jump, if greater than, to address
  • JGE addr - jump, if greater than or equal, to address
  • JL addr - jump, if less than, to address
  • JLE addr - jump, if less than or equal, to address
  • STOP - stop execution
Assembly language
A C compiler translates C code into assembly language. Assuming that RAM starts at address 128 in this processor and that ROM (which contains the assembly language program) starts at address 0, the assembly for our simple microprocessor might look like this:

// Assume a is at address 128
// Assume F is at address 129
0  CONB 1      // a = 1;
1  SAVEB 128
2  CONB 1      // f = 1;
3  SAVEB 129
4  LOADA 128   // if a > 5 then jump to 17
5  CONB 5
6  COM
7  JG 17
8  LOADA 129   // f = f * a;
9  LOADB 128
10 MUL
11 SAVEC 129
12 LOADA 128   // a = a + 1;
13 CONB 1
14 ADD
15 SAVEC 128
16 JUMP 4      // loop back to the if
17 STOP

Read only memory (ROM)
So the question now is, "How do all of these instructions get into read-only memory?" Each of these assembly language instructions must be represented as a binary number. For simplicity, let's assume each assembly language instruction is assigned a unique number, like this:

  • LOADA - 1
  • LOADB - 2
  • CONB - 3
  • SAVEB - 4
  • SAVEC - 5
  • ADD - 6
  • SUB - 7
  • MUL - 8
  • DIV - 9
  • COM - 10
  • JUMP addr - 11
  • JEQ addr - 12
  • JNEQ addr - 13
  • JG addr - 14
  • JGE addr - 15
  • JL addr - 16
  • JLE addr - 17
  • STOP - 18
These numbers are known as opcodes. In read-only memory, our little program looks like this:

// Assume a is at address 128
// Assume F is at address 129
Addr  opcode/value
0     3    // CONB 1
1     1
2     4    // SAVEB 128
3     128
4     3    // CONB 1
5     1
6     4    // SAVEB 129
7     129
8     1    // LOADA 128
9     128
10    3    // CONB 5
11    5
12    10   // COM
13    14   // JG 17 (ROM address 31)
14    31
15    1    // LOADA 129
16    129
17    2    // LOADB 128
18    128
19    8    // MUL
20    5    // SAVEC 129
21    129
22    1    // LOADA 128
23    128
24    3    // CONB 1
25    1
26    6    // ADD
27    5    // SAVEC 128
28    128
29    11   // JUMP 4 (ROM address 8)
30    8
31    18   // STOP

You can see that 7 lines of C code became 18 lines of assembly language, which in turn became 32 bytes in read-only memory.

Decoding
The instruction decoder turns each opcode into the set of signals that drive the various components of the microprocessor. Let's take the ADD instruction as an example and see what the decoder needs to do:

  • 1. During the first clock cycle, the instruction itself must be loaded, so the decoder needs to: activate the tri-state buffer for the program counter, activate the RD line, activate the data-in tri-state buffer, and latch the instruction into the instruction register
  • 2. During the second clock cycle, the ADD instruction is decoded. Very little needs to happen here: the operation of the arithmetic logic unit (ALU) is set to addition and its output is latched into register C
  • 3. During the third clock cycle, the program counter is incremented (in theory, this can overlap with the second cycle)
Every instruction can be broken down into a set of sequenced operations like the ones we just looked at, which manipulate the components of the microprocessor in the correct order. Some instructions, like this ADD, take two or three clock cycles. Others may take five or six.

Wrapping up


The number of transistors has a huge effect on processor performance. As shown above, a typical instruction on the Intel 8088 took about 15 clock cycles to complete. The more transistors, the higher the performance: it's that simple. A large transistor budget also makes techniques like pipelining possible.

In a pipelined architecture, instruction execution overlaps. It may still take five clock cycles to execute each instruction, but there can be five instructions in different stages of execution at the same time, so it looks as if one instruction completes every clock cycle.

All of these trends have allowed the number of transistors to grow, producing the multi-million-transistor heavyweights available today. Such processors can perform about a billion operations per second, just imagine. By the way, many manufacturers are now interested in releasing 64-bit mobile processors, and clearly another wave is coming, this time with 64-bit architecture in fashion. Maybe I will get to that topic in the near future and explain how it actually works. That is all for today. I hope you found it interesting and learned a lot.

Nowadays, even the simplest mobile phones cannot do without a microprocessor, to say nothing of tablets, laptops and desktop personal computers. So what is a microprocessor, and how did its history unfold? In plain language, a microprocessor is a particularly complex and multifunctional integrated circuit.

The history of the microcircuit (integrated circuit) begins in 1958, when Jack Kilby, an employee of the American firm Texas Instruments, invented a semiconductor device containing several transistors in a single package, connected by conductors. The first microcircuit, the progenitor of the microprocessor, contained only 6 transistors and consisted of a thin germanium plate with gold tracks applied to it, all mounted on a glass substrate. For comparison, today's chips contain millions and even tens of millions of semiconductor elements.

By 1970 quite a few manufacturers were developing and producing integrated circuits of various capacities and purposes, but this very year can be considered the birth date of the first microprocessor. It was in this year that Intel created a memory chip with a capacity of just 1 Kbit: negligible by modern standards, but incredibly large for its time. The chip could store up to 128 bytes of information, far more than comparable devices, which was a huge achievement. At about the same time, the Japanese calculator manufacturer Busicom ordered 12 microcircuits of various functions from the same Intel. Intel's specialists managed to implement all 12 functions in a single microcircuit. Moreover, the resulting chip turned out to be multifunctional, since its functions could be changed programmatically without changing its physical structure: it performed particular functions depending on the commands supplied to its control pins.

A year later, in 1971, Intel released the first 4-bit microprocessor, codenamed 4004. Compared with the first 6-transistor microcircuit, it contained as many as 2,300 semiconductor elements and performed 60,000 operations per second. At the time, this was a huge breakthrough in microelectronics. "4-bit" meant that the 4004 could process 4 bits of data at once. Two years later, in 1973, the company produced the 8-bit 8008 processor, which worked with 8-bit data. Starting in 1976, the company began developing a 16-bit microprocessor, the 8086. It was this chip that came to be used in the first IBM personal computers and, in effect, laid one of the cornerstones in the history of computing.

Microprocessor types

By the nature of the executable code and the organization of the control device, several types of architectures are distinguished:

    A processor with a complex instruction set (CISC). This architecture is characterized by a large number of complex instructions and, as a consequence, a complex control unit. Early CISC processors and embedded processors have long instruction execution times (from a few clock cycles to hundreds), determined by the microcode of the control unit. High-performance superscalar processors are characterized by deep program analysis and out-of-order execution of operations.

    A processor with a reduced instruction set (RISC). This architecture has a much simpler control unit. Most RISC instructions perform the same small number of operations (1, sometimes 2-3), and the instruction words themselves are, in the overwhelming majority of cases, of equal width (PowerPC, ARM), although there are exceptions (ColdFire). In superscalar RISC processors, instructions are grouped in the simplest way, without reordering their execution.

    An explicitly parallel processor (VLIW, EPIC). It differs from the others primarily in that the sequencing and parallelism of operations and their distribution among functional units are explicitly specified by the program. Such processors can have a large number of functional units without greatly complicating the control unit or losing efficiency. They typically use a wide instruction word consisting of several syllables, each determining the behavior of one functional unit during the clock cycle.

    A minimal instruction set processor (MISC). This architecture is distinguished primarily by a very small number of instructions (a few dozen), almost all of them zero-operand. Such an approach makes it possible to pack code very densely, allocating 5 to 8 bits per instruction. Intermediate data in such a processor is usually kept on an internal stack, and operations are performed on the values at the top of the stack. This architecture is closely tied to the ideology of programming in the Forth language and is usually used to execute programs written in it.

    A variable instruction set processor. An architecture that can reprogram itself, changing its instruction set to suit the task at hand.

    A transport-triggered processor. This architecture originally branched off from EPIC but differs fundamentally from the others in that its instructions encode not functional operations but data transfers (the so-called transports) between functional units and memory, in an arbitrary order; the transfers themselves trigger the operations.

According to the way of storing programs, two architectures stand out:

    Von Neumann architecture. Processors of this architecture use a single bus and a single I/O path to access both program and data.

    Harvard architecture. In processors of this architecture, there are separate buses and I / O devices for fetching programs and exchanging data. In embedded microprocessors, microcontrollers, and DSPs, this also defines the existence of two independent memories for storing programs and data. In central processing units, this determines the existence of a separate instruction and data cache. Behind the cache, buses can be combined into one by multiplexing.

It is hard to imagine human life without modern electronics. Of course, there are still places where modern technology remains unheard of and unused. But the overwhelming majority of the world's population is connected with electronics in one way or another, and it has become an integral part of our life and work.

Since ancient times, people have used various devices to make production processes more efficient or their own existence more comfortable. The real breakthrough came in the late 1940s, when the transistor was invented. The first were bipolar transistors, which are still used today. They were followed by MOSFETs (metal-oxide-semiconductor field-effect transistors).

The first transistors of this type were more expensive and less reliable than their bipolar cousins. But starting in 1964, electronics began to use integrated circuits based on MOS transistors. This subsequently reduced the cost of producing electronic devices and significantly shrank gadgets and systems while cutting power consumption. Over time, microcircuits became more complex and sophisticated, replacing large blocks of transistors and opening the way to ever smaller electronic devices.

By the end of the 1960s, microcircuits with a fairly large number of logic gates (large for the time: 100 or more) had come into widespread use. This made it possible to build computers from the new elements. Computer developers recognized relatively quickly that increasing the density of transistors in a microcircuit would ultimately allow an entire computer processor to be created on a single chip. Initially, MOS integrated circuits were used to build terminals and calculators, and by developers of on-board systems for civilian and military transport.

Key moment

Today, most electronics specialists agree that a qualitatively new stage in the development of electronics began in 1971, when the 4-bit 4004 processor from Intel appeared, later succeeded by the 8-bit 8008 chip. It came about after a small Japanese company called Nippon Calculating Machine, Ltd. (later Busicom Corp.) ordered a total of 12 chips from Intel. The company needed the chips for its calculators, and their logical design was developed by an employee of the customer company. At that time, a new set of chips performing highly specialized functions had to be developed for each device.

While fulfilling the order, Marcian Edward Hoff proposed reducing the number of chips in the Japanese company's new device by introducing a central processor. According to the engineer's idea, it was to become the data-processing core, performing arithmetic and logical functions and replacing several chips at once. The management of both companies approved the idea. In the fall of 1969, Hoff, with the help of Stanley Mazor, proposed a new chip-set architecture in which the number of chips was reduced to just 4, among them a 4-bit central processor, a ROM and a RAM.

The processor itself was designed by Federico Faggin, an Italian physicist who became the chief designer of the MCS-4 family at Intel. It was he who, thanks to his knowledge of MOS technology, was able to build the processor that realized Hoff's idea. Incidentally, the world's first commercial chip using silicon-gate technology, the Fairchild 3708, was also his work.

Faggin, as an Intel employee, created a new method for designing arbitrary logic circuits. He was assisted in this work by Masatoshi Shima, then an engineer at Busicom. Faggin and Shima later developed the Zilog Z80 microprocessor, which, by the way, is still in production.


Intel 4004 processor architecture

But the main event took place on November 15, 1971. That is the date the first Intel microprocessor, the 4004 chip, appeared; it cost $200 at the time. Almost all the functions of a mainframe processor were implemented on a single die. It was announced in November 1971 in Electronic News magazine.

Processor specifications:


  • Date of appearance: November 15, 1971
  • Number of transistors: 2300
  • Crystal area: 12 mm²
  • Process technology: 10 μm (p-channel silicon-gate MOS technology)
  • Clock frequency: up to 740 kHz (in practice 500 to 740 kHz, corresponding to a clock period of 2 to 1.35 μs)
  • Width of registers: 4 bits
  • Number of registers: 16 (16 four-bit can be used as 8 eight-bit)
  • Number of ports: 16 four-bit input and 16 four-bit output
  • Data bus width: 4 bits
  • Width of the address bus: 12 bits
  • Harvard architecture
  • Stack: internal 3-tier
  • Instruction memory (ROM): 4 KB (32,768 bits)
  • Addressable data memory (RAM): 640 bytes (5,120 bits)
  • Number of instructions: 46 (of which 41 are 8-bit and 5 are 16-bit)
  • Instruction cycle: 10.8 microseconds
  • Supply voltage: −15 V (pMOS)
  • Operating temperature: 0 to +70 °C
  • Storage temperature: -40 to +85 °C
  • Connector: DIP16 (the chip was either soldered directly onto the printed circuit board or installed in a special socket)
  • Package: 16-pin DIP, in one plastic and three ceramic variants, e.g. C4004 (white ceramic with gray traces), C4004 (white ceramic), D4004 (black-and-gray ceramic), P4004 (black plastic)
  • Delivery type: separately and in MCS-4 sets (ROM, RAM, I / O, CPU)
This processor executed 60,000 to 93,000 instructions per second. For comparison, ENIAC, one of the first electronic computers, could execute only 5,000 instructions per second, while occupying 280 square meters, weighing 27 tons and consuming 174 kW of power.

The 4004 processor itself never became hugely popular, but the 8080 chip, which could be called the "great-grandson" of the 4004, came to be used everywhere.

Calculators and computers

In 1971 Intel had competitors. For example, Mostek, a company that developed semiconductor devices and systems based on them, created the world's first "calculator on a chip", the MK6010.

In June 1971, Texas Instruments launched a media campaign highlighting the benefits of its processor. At the time, the TMX 1795-based Datapoint 2200 was described as "a powerful computer superior to the original", meaning that it was vastly superior to the bipolar-transistor-based Datapoint 2200. But CTC, after testing the new chip, rejected it and continued using bipolar chips. Intel, meanwhile, was still working on its own processor.

After some time TI, convinced that there was no demand for the TMX 1795 (later the TMC 1795), ended the media campaign and stopped producing the chip. But this particular chip went down in history as the first 8-bit processor.

In 1971, CTC lost interest in a single-chip processor for its systems and transferred all rights to the new chip to Intel. Intel did not pass up the opportunity: it continued developing the 8008 and successfully offered it to a number of other companies, and by April 1972 it was shipping these processors by the hundreds of thousands. Two years later the 8008 was replaced by the new 8080, after which came the 8086, and the era of x86 systems began. So when working on a powerful PC or laptop today, it is worth remembering that the architecture of such systems was once developed for the Datapoint 2200 terminal.

Intel also used more advanced technology, which gave its processors an advantage: they were fast and relatively energy-efficient. In addition, Intel's chips had a higher transistor density than TI's, which allowed for smaller processors. Marketing played an important role too; here Intel also made a number of successful moves that ensured the popularity of its products.

Be that as it may, leadership in the development of the first processors is far less clear-cut than is commonly believed. There were several pioneers at once, but only one of their designs later became popular. And it is the modernized "descendants" of that technology that we all deal with today, in the 21st century.


Introduction

1 Development of microprocessors

2 Microprocessors i80386

3 Microprocessors i80486

4 Pentium processors

5 Processor performance

6 Coprocessors

Bibliography


Introduction

The most important element of any PC is the microprocessor; it largely determines the capabilities of the computing system. The first microprocessor, the i4004, was manufactured in 1971, and Intel has held the leading position in this market segment ever since. Its most successful early design was the i8080: it was the basis of the Altair computer, for which B. Gates wrote his first Basic interpreter. The classic architecture of the i8080 had a huge influence on the further development of single-chip microprocessors. The i8088 microprocessor, announced by Intel in June 1979, became the true industry standard for PCs when, in 1981, the "blue giant" (IBM) chose it for its PC. Initially the i8088 ran at 4.77 MHz and delivered about 0.33 Mips, but clones rated for a higher 8 MHz clock were later developed. The i8086 microprocessor had appeared exactly a year earlier, in June 1978, and became popular thanks to the Compaq DeskPro computer. Building on the i8086 architecture and market demand, Intel released the i80286 in February 1982, at the same time as the new IBM PC AT computer. Along with higher performance, it offered a protected mode, which used a more sophisticated memory-management technique. Protected mode allowed programs such as Windows 3.0 and OS/2 to use more than 1 MB of RAM. Thanks to its 16-bit data bus, the processor could exchange 2-byte words over the new system bus, and in protected mode it could access 16 MB of RAM. The i80286 introduced multitasking and virtual-memory control at the chip level for the first time. At a clock frequency of 8 MHz, it achieved a performance of 1.2 Mips.

1 Development of microprocessors

Computers have been in widespread use since the 1950s. At first they were very large and expensive devices, used only by government institutions and large firms. The size and shape of digital computers changed beyond recognition as a result of the development of new devices called microprocessors.

A microprocessor (MP) is a software-controlled electronic digital device, built on one or more large-scale integrated circuits, designed to process digital information and to control that processing.

In 1970, Marcian Edward Hoff of Intel designed an integrated circuit similar in function to a mainframe central processing unit: the first microprocessor, the Intel 4004, which was brought to market in 1971.

It was a real breakthrough, because the Intel 4004 MP, less than 3 cm across, was more capable than the giant ENIAC machine. True, it worked much more slowly and could process only 4 bits of information at a time (mainframe processors handled 16 or 32 bits at once), but the first MP cost tens of thousands of times less.

The die was a 4-bit processor with a classical Harvard-type computer architecture, manufactured with advanced p-channel MOS technology using 10-micron design rules. The device's circuitry comprised 2,300 transistors. The MP ran at a clock frequency of 750 kHz with an instruction cycle of 10.8 μs. The i4004 chip contained an address stack (a program counter and three LIFO stack registers), a block of general-purpose registers (a register file, RF), a 4-bit parallel ALU, an accumulator, an instruction register with an instruction decoder and control circuitry, and circuits for communicating with external devices. All these functional units were tied together by a 4-bit internal bus. The instruction memory reached 4 KB (for comparison, the memory of a minicomputer in the early 1970s rarely exceeded 16 KB), and the register file consisted of 16 4-bit registers, which could also be used as 8 8-bit registers. This organization of the general-purpose registers was retained in subsequent Intel MPs. The three stack registers provided three levels of subroutine nesting. The i4004 came in a plastic or ceramic DIP (Dual In-line Package) with just 16 pins. Its instruction set consisted of only 46 instructions.

At the same time, the chip had very limited input/output facilities, and the instruction set contained no logical data-processing operations (AND, OR, EXCLUSIVE OR), so these had to be implemented in special subroutines. The i4004 also had no halt (HALT) instruction and no interrupt handling.

The processor's instruction cycle took 8 clock periods of the master oscillator. The address and data buses were multiplexed: a 12-bit address was transmitted 4 bits at a time.
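The multiplexed transfer of a 12-bit address over a 4-bit bus can be sketched as follows. The low-nibble-first ordering here is an assumption for illustration, not a statement about the 4004's exact bus protocol:

```python
# Sketch of driving a 12-bit address onto a 4-bit multiplexed bus
# as three consecutive nibbles.

def split_address(addr):
    """Split a 12-bit address into three 4-bit nibbles, low nibble first."""
    assert 0 <= addr < 1 << 12
    return [(addr >> shift) & 0xF for shift in (0, 4, 8)]

def join_address(nibbles):
    """Reassemble the 12-bit address from its three nibbles."""
    return nibbles[0] | (nibbles[1] << 4) | (nibbles[2] << 8)

addr = 0xABC
nibbles = split_address(addr)       # three bus cycles' worth of data
assert join_address(nibbles) == addr
```

Three of the eight clock periods in the instruction cycle are thus consumed just getting the address out, which is part of why the 10.8 μs cycle is so long by later standards.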

On April 1, 1972, Intel began shipping the industry's first 8-bit microprocessor, the i8008. The chip was manufactured in p-channel MOS technology with a 10-micron design rule and contained 3,500 transistors. The processor ran at 500 kHz with a machine cycle of 20 μs (10 master-oscillator periods).

Unlike its predecessors, this MP had a Princeton-type (von Neumann) architecture and allowed a combination of ROM and RAM to be used as its memory.

Compared with the i4004, the number of general-purpose registers fell from 16 to 8, and two registers were reserved for holding the address used in indirect memory addressing (a technology limitation: the register block in the 8008, like those of the 4004 and 4040, was implemented as dynamic memory). The machine cycle was shortened (from 8 to 5 states). A READY signal was introduced to synchronize operation with slow devices.

The instruction set comprised 65 instructions. The MP could address 16 KB of memory, and its performance was 2.3 times that of the 4-bit MP. On average, about 20 medium-scale integration circuits were needed to interface the processor with memory and I/O devices.

The possibilities of p-channel technology for building complex high-performance MPs were nearly exhausted, so the main effort shifted to n-channel MOS technology.

On April 1, 1974, the Intel 8080 MP was presented to the public. Thanks to n-MOS technology with a 6-micron design rule, 6,000 transistors could be placed on the chip. The clock frequency was raised to 2 MHz, and the instruction cycle time dropped to 2 μs. The amount of memory the processor could address grew to 64 KB.

Thanks to a 40-pin package, the address and data buses could be separated, and the total number of chips required to build a minimal system fell to 6.

A stack pointer, used heavily in interrupt handling, was added to the register file, along with two registers for internal transfers that were inaccessible to programs. The register block was implemented with static memory. Moving the accumulator out of the register file and into the ALU simplified the control circuitry of the internal bus.

New in the MP's architecture was a multilevel vectored interrupt system. This solution made it possible to support up to 256 interrupt sources (before LSI interrupt controllers appeared, the interrupt-vector generation circuit required up to 10 additional medium-scale chips). The i8080 also introduced a direct memory access (DMA) mechanism, as used earlier in IBM System/360 mainframes and elsewhere.

DMA opened the way for microcomputers to use such complex devices as magnetic disk and tape drives and CRT displays, turning the microcomputer into a full-fledged computing system.

Beginning with its very first chip, it became the company's tradition to release not a standalone CPU chip but a family of LSI devices designed to work together.

Modern microprocessors are built on the 32-bit x86 architecture, also called IA-32 (Intel Architecture, 32-bit), but a transition to more capable 64-bit architectures is under way. Intel's offering is IA-64 (Intel Architecture, 64-bit), while AMD (Advanced Micro Devices) took a different route with its x86-64 extension: its Athlon 64 microprocessor, mass-produced and put on sale in 2003, is notable for running both 32-bit and 64-bit applications. The performance of 64-bit microprocessors is considerably higher.

2 The i80386 microprocessor

In October 1985, Intel announced its first 32-bit microprocessor, the i80386. The first computer to use it was the Compaq DeskPro 386. The new microprocessor's full 32-bit architecture was complemented by an extended memory-management unit: in addition to a segmentation unit it gained a paging unit, which made it easy to relocate segments from one area of memory to another. At a clock frequency of 16 MHz the chip delivered 6 MIPS. Thirty-two address lines made it possible to address 4 GB of memory physically, and a new virtual-8086 (V86) mode was introduced, in which several i8086 tasks could run simultaneously.

The i80386 microprocessor with the coprocessor on the same die was called the i80386DX. A cheaper 32-bit model, the i80386SX, appeared only in July 1988. The new microprocessor used a 16-bit data bus and a 24-bit address bus, which suited the standard IBM PC AT particularly well. Software written for the i80386DX ran on the i80386SX, since the internal registers were completely identical. The SX index comes from the word "sixteen" (the 16-bit data bus); for the i486, SX came to mean the absence of a coprocessor. At the fall trade show in 1989, Intel announced the i80486DX, which packed 1.2 million transistors onto a single die and was fully compatible with the other x86 processors. For the first time, the new chip combined the CPU, coprocessor, and cache memory on one die. A pipelined architecture of the kind found in RISC processors allowed it to reach four times the performance of conventional 32-bit systems, and 8 KB of on-chip cache accelerated execution by holding frequently used instructions and data. At 25 MHz the microprocessor delivered 16.5 MIPS. A 50 MHz version, introduced in January 1991, raised performance by another 50%. The built-in coprocessor significantly sped up mathematical calculations, although it later became clear that only about 30% of users needed it.

The first microprocessors were 4-bit devices built on a single die.

The first microprocessors were based on p-MOS circuits. Modern microprocessors are implemented in n-MOS circuits, which combine low cost with medium speed; in extremely low-power CMOS circuits; and in high-speed TTL circuits.

The first microprocessors (MPs) appeared in the early 1970s as the joint achievement of systems engineers, who solve problems of the architectural organization of computing equipment, and of circuit engineers engaged in the design and manufacturing technology of electronic devices.

The first microprocessor, the 4-bit Intel 4004, entered an unprepared market in 1971. Designed to meet the needs of calculator manufacturers, the 4004 MP presented itself to the world as the herald of a new era in integrated electronics.

The earliest microprocessors used a method of memory management known as purely machine memory.

It is worth recalling that the first microprocessors imported to Japan in 1971 cost about a thousand dollars.

Over the more than 30 years since the first microprocessors appeared, certain rules of data exchange have been worked out, which designers of new microprocessor systems follow. These rules are not overly complicated, but successful work requires knowing them firmly and following them strictly.

Operating systems are created for each type of microprocessor on the basis of the instruction set built into the microprocessor during its development. The first microprocessor was created by Intel, the leading chip manufacturer.

Can any technical achievement of the computer age rival the microprocessor in importance? The first microprocessors, whose short history began just a decade ago, were based mainly on the achievements of microelectronics, a technology that arose much later than computers themselves and largely independently of them. From the outset, microprocessor designers and manufacturers met with overwhelming acclaim once they could demonstrate that each new development came one step closer in structure to a modern medium or large computing machine. Observers readily concluded that if packing density, speed, and automated design continued to advance as expected, microprocessors would soon rival large minicomputers, and perhaps large computers, in power and logic.

In 1970, another important step was taken on the way to the personal computer: Marcian Edward Hoff of Intel designed an integrated circuit similar in function to the central processing unit of a large computer. Thus appeared the first microprocessor, the Intel 4004, which went on sale in 1971. It was a real breakthrough, because the Intel 4004, less than 3 cm across, was more productive than the giant ENIAC machine. True, its capabilities were much more modest than those of the central processors of the large computers of the day: it worked much slower and could process only 4 bits of information at a time (mainframe processors handled 16 or 32 bits at once), but it also cost tens of thousands of times less.

The creation of an operating system such as PC-DOS is neither a matter of chance nor the result of purely technocratic planning. Economic competition had led to the emergence of operating systems for mainframes long before the first microprocessors appeared.

It is a single chip that controls everything that happens in the PC. That chip runs at a particular clock frequency, measured in megahertz. By today's standards, the first microprocessors (the 8088 or 80286) were terribly slow and could not cope with modern software.

Redesigning a large-scale IC every time a company wants to update its product range, which happens very often, is truly a colossal job. The microprocessor was born from an idea put forward by specialists at Busicom: to design an integrated circuit that could easily be adapted to any new product the company introduced. Alas, Japan was then still too weak in R&D, so the United States was able to seize the opportunity and create the first microprocessor.

However, Intel pressed on with the prototype, on which development funds had already been spent. Thus the well-known Intel 8008 MP became the first microprocessor on the world market.

Who and when invented the world's first microprocessor

Every Intel employee knows who invented the microprocessor. In 1969, Japanese developers who had previously designed calculators came to work at this then little-known firm. The engineers used twelve integrated circuits to create an ordinary desktop calculator; Masatoshi Shima played the leading role in that project. At the time, Ted Hoff managed one of Intel's departments. He, the future creator of the microprocessor, realized that instead of a calculator with programming ability it would be better to build a computer that could be programmed to do a calculator's work.

The creation of the world's first processor began with the development of its architecture. In 1969, an Intel employee suggested naming the first series of microprocessors the 4000 family. Each chip in the family came in a sixteen-pin package. This helps clarify what the first microprocessor consisted of. The 4001 was a 256-byte (2-Kbit) ROM. The 4003 was a ten-bit expander that could connect a keyboard and various indicators. And the 4004 itself was a four-bit processor; many consider it the very first microprocessor. The 4004 contained two thousand three hundred transistors and operated at a frequency of 108 kHz.

Today one can find differing opinions about when the first processor was created, but most agree on November 15, 1971 as the date of the creation of the world's first microprocessor. The development was initially commissioned by the Japanese company Busicom for sixty thousand dollars, but Intel later returned the money in order to remain the sole rights holder to the invention.

The first processor was used in traffic-control systems, in particular in traffic lights, and also in blood analyzers. A little later, the 4004 found a place aboard the Pioneer 10 space probe, launched in 1972.

The first domestic microprocessor was created in the early seventies at the Special Computing Center under the leadership of D.I. Yuditsky.

Thus, in the 1970s microprocessors gradually began to penetrate various areas of human activity. Processors later divided into microprocessors proper and microcontrollers. The former are used in personal computers, while microcontrollers found application in controlling various systems. Microcontrollers have a weaker computing core but many additional peripheral nodes; they are sometimes called single-chip microcomputers, since all the nodes and modules sit directly on the chip.

The Intel 4004 is a 4-bit microprocessor designed by Intel Corp. and released on November 15, 1971. This chip is considered the world's first commercially available single-chip microprocessor. However, in 1970, more than a year before the release of the i4004, the military F14 CADC microprocessor was manufactured; it remained classified until 1998.

In 1969, a small Japanese company, Nippon Calculating Machine, Ltd. (later Busicom Corp.), a calculator manufacturer, ordered 12 chips from Intel for a new desktop calculator (the logic design of the system was developed by Busicom employee Masatoshi Shima (嶋正利)). Chips of that era were highly specialized and designed to perform strictly defined work, so the entire chipset had to be redesigned for each new application. Intel employees found this approach unprofitable. Marcian Edward (Ted) Hoff, then 32, proposed to the executives of Intel and Busicom that the number of chips be reduced by using a CPU: one chip that would perform the arithmetic and logic functions of several. The idea was enthusiastically accepted by the management of both firms. During the fall of 1969, Ted Hoff, with the help of Stanley Mazor, worked out a new architecture reduced to four chips, including a central processing unit: a 4-bit CPU, a ROM for storing the program, a RAM for user data, and an I/O shift register. Development of the microprocessor itself began only in April 1970, when Federico Faggin, an Italian physicist, joined Intel as chief designer of the MCS-4 family. Thanks to his deep knowledge of the silicon-gate MOS technology he had developed at Fairchild in 1968, and to the extensive experience in logic design of computers gained in 1961 at the Italian firm Olivetti, Faggin managed to fit the CPU onto one single chip. In 1968, while at Fairchild, he had also implemented the world's first commercial chip to use silicon-gate technology, the Fairchild 3708. At Intel, Faggin developed a new, previously nonexistent methodology for designing logic chips and contributed many innovations in process and circuit design that were essential to implementing a microprocessor on a single chip.
Masatoshi Shima, who worked as a software engineer at Busicom and had no experience designing MOS devices, helped Faggin develop the MCS-4 and later worked with him at Zilog, founded in late 1974 by Faggin and Ralph Ungermann and devoted entirely to microprocessors. Faggin and Shima together developed the Zilog Z80 microprocessor, which is still in production today.

The second digit denoted the product type: 0 for processors, 1 for RAM chips, 2 for controllers, 3 for ROM chips, 4 for shift registers, 5 for EPLD chips, 6 for PROM chips, 7 for EPROM chips, 8 for supervisory chips and timing circuits for pulse generators, 9 for telecommunications chips.

The third and fourth digits gave the product's serial number, and since the first processor required three more specialized chips (ROM, RAM, and an I/O expander) that were released before it, the microprocessor was named 4004.
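The numbering scheme just described can be illustrated with a small, hypothetical helper; the function, its name, and the type labels are assumptions for illustration, not part of any Intel tooling:

```python
# Hypothetical decoder for the 4-digit Intel part numbering described above:
# second digit = product type, last two digits = serial number.

PRODUCT_TYPES = {
    "0": "processor", "1": "RAM", "2": "controller", "3": "ROM",
    "4": "shift register", "5": "EPLD", "6": "PROM", "7": "EPROM",
    "8": "supervisory/timing", "9": "telecom",
}

def decode_part(number):
    """Return (product type, serial number) for a 4-digit part number."""
    assert len(number) == 4 and number.isdigit()
    return PRODUCT_TYPES[number[1]], int(number[2:])

print(decode_part("4004"))   # -> ('processor', 4)
print(decode_part("8080"))   # -> ('processor', 80)
```

Under this scheme the 8080 decodes as processor number 80, consistent with the text's account of why the first processor ended up as number 04.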

On November 15, 1971, the 4004 chip was released: the first microprocessor to implement, at a cost of $200, all the functions of a mainframe processor on a single chip. The world's first microprocessor was announced in November 1971 in Electronic News magazine.

The 4004 microprocessor was housed in a 16-pin DIP package with a die measuring 12 mm² (3×4 mm). The processor could execute 60,000 instructions per second on average, up to a maximum of 93,000. (For comparison, one of the first fully electronic computers, the American ENIAC, executed at most 5,000 instructions per second while occupying 278.7 square meters and weighing 30 tons.) Intel foresaw the crucial importance of microprocessors in the miniaturization of computers and therefore bought the rights to the 4004 microprocessor and its improved versions from Busicom for $60,000.

In 1971, however, the processor did not become a sales hit. Intel's strategy was to market the 4004 in order to expand the market for its much more popular 1101/1103 memory chips. Only the 8080 microprocessor, the electronic "great-grandson" of the 4004, came to enjoy well-deserved popularity.

Specialized microcircuits of the 4xxx series

The 4004 chip shipped with three companion ASICs: a ROM, a RAM, and an I/O expander. Although these chips had their own designation systems (series 1xxx, 2xxx, and 3xxx), they received second names in the 4xxx series, which came to be listed alongside their usual numbering.

  • 4001* — 256-byte ROM (256 8-bit program instructions) with one built-in 4-bit I/O port.
  • 4002 — 40-byte RAM (80 4-bit cells) with one built-in 4-bit output port. The RAM on the chip is organized as 4 "registers" of twenty 4-bit cells each:
    • 16 data cells (used for mantissa digits in the original calculator)
    • 4 status cells (used for exponent digits and signs in the original calculator)
  • 4003 — 10-bit "I/O expander" (a shift register converting serial code to parallel)
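The 4002's memory organization described above can be modeled with a short sketch. This is a toy model of the layout only, under the stated figures, not the real chip's addressing logic:

```python
# Toy model of the 4002 RAM layout: four "registers", each holding
# sixteen 4-bit data cells plus four 4-bit status cells
# (4 * 20 = 80 four-bit cells, i.e. 40 bytes in total).

class Ram4002:
    REGISTERS, DATA_CELLS, STATUS_CELLS = 4, 16, 4

    def __init__(self):
        self.data = [[0] * self.DATA_CELLS for _ in range(self.REGISTERS)]
        self.status = [[0] * self.STATUS_CELLS for _ in range(self.REGISTERS)]

    def total_bits(self):
        cells = self.REGISTERS * (self.DATA_CELLS + self.STATUS_CELLS)
        return cells * 4            # 80 cells * 4 bits = 320 bits

ram = Ram4002()
print(ram.total_bits() // 8)        # -> 40 (bytes)
```

The split into mantissa-sized data cells and a handful of status cells mirrors the chip's calculator origins: a register held one decimal number, with its exponent and sign kept separately.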

In addition, the 4008 and 4009 chips were released in the 4xxx family; they could also be supplied together with the 4004.

  • 4008* — 8-bit address latch for accessing standard memory chips, with one built-in I/O port
  • 4009* — converter of I/O accesses to standard memory and I/O chips

(*) Note: the 4001 chip could not be used in a system together with the 4008/4009 chip pair.

The 400x family was also known as the MCS-4 (Micro Computer Set, 4-bit).

Intel also sold the Intellec-4 (big blue boxes), a software development and testing system for the 4004. It was in effect one of the first microcomputers built from the 4xxx series (the 4004, a 4201, four 4001s, and two 4002s). Only its high price ($5,000) kept it from being called a personal computer.

The Intel 4004 is naturally one of the most sought-after collectible chips. Most highly prized are the white-and-gold Intel 4004 chips with visible gray traces on the white part (the original package type); in 2004 such a chip was valued at about $400 in eBay online auctions. Somewhat less valuable are chips without gray traces on the package, usually worth about $200-$300. Chips with gray traces but without a release date on the underside were made earlier, so their value is higher.

Intel Launches Its First Microprocessor

The evolutionary process that led to modern microcomputers was extremely fast. Although a great number of discoveries and inventions went into the creation of the machine known as the "personal computer", one event stands out as the most important milestone in that history.

Introduced on November 15, 1971, the Intel® 4004 microprocessor launched an electronics revolution that changed the world. Before the 4004 there were no programmable microprocessors on the market; it was the first processor to make software an important element of microelectronic design.

In 1969, Intel stirred the electronics industry with 1-Kbit memory ICs, far larger in capacity than anything else available at the time. Owing to the company's success in chip development, the Japanese calculator firm Busicom approached it with an order for 12 chips for one of its calculators.

Intel's engineers took the 12-chip design and combined all the desired features and capabilities into a single general-purpose chip. This IC differed from previous designs, which were programmed for a single purpose with embedded instructions.

The concept was to design an almost complete computing device on a single chip. The four-bit Intel 4004 microprocessor became just such a device. It was about the size of a fingernail, yet had the same computing power as ENIAC, the first electronic computer, built in 1946, which occupied an entire room and used 18,000 vacuum tubes.

The Intel 4004 chip is one of the most popular among collectors. The most highly prized Intel 4004 chips are white and gold, with visible gray traces on the white part; in 2004 such a chip was valued at about $400 in eBay online auctions.

Somewhat less valuable are chips without gray traces on the package, usually worth about $200-300. Chips without a release date on the underside were made earlier, so their value is higher.

Intel began operations in July 1968. Its founders, engineers Gordon Moore and Robert Noyce, had previously worked at Fairchild. The specialists immediately set the main direction of their work: to make semiconductor memory as accessible and practical as possible. At the time, memory of this type was many times more expensive than memory based on magnetic technologies. In this article we will find out how the first microprocessor was developed and who its creators were.

Later, the Japanese company Busicom became interested in the company's work and contracted with Intel to develop chips for a line of programmable calculators. In those years, such chips were designed for one specific device at a time.

The first microprocessor. History of creation

Initially the project called for at least 12 chips of different architectures. Intel's engineers rejected this approach, proposing instead a universal single-chip device that fetched its commands from RAM. With just 4 chips (a ROM, an I/O controller, a RAM, and the 4004 processor), program code could drive them and perform a variety of tasks. Because the module was versatile, it could quickly be adapted to other devices.

In the spring of 1970, Intel hired engineer Federico Faggin to design the 4004 control chip, the first microprocessor. Faggin had previously worked at Fairchild Semiconductor, where he invented silicon-gate technology, and those developments were used in creating the new chips.

Initially, all rights to the new chip belonged to Busicom. Faggin was confident his invention would find widespread use in the future, so he convinced management to buy back the rights. Busicom, which was also in serious financial trouble, agreed to compensation of $60,000.

On November 15, 1971, the 4004 chip (Intel's very first microprocessor) was officially announced; it was used in the MCS-4 microcomputer. The processor's clock frequency was only 108 kHz. The chip was built with 10-micron technology, which allowed 2,300 transistors to fit on the die. Notably, its performance was comparable to that of ENIAC (1946), which used 18,000 vacuum tubes and occupied 85 square meters.

Although the first microprocessor was intended for calculators, it later found use in other areas. For example, the chip was used in medicine for blood tests, in traffic-control systems, and even in the Pioneer 10 space probe developed by NASA for research.


The world's first Intel 4004 microprocessor was created 40 years ago

Exactly 40 years ago, Intel released its first commercial microprocessor, the Intel 4004, which became the world's first. It happened on November 15, 1971, but it all began in 1969, when the Japanese company Nippon Calculating Machine Corporation asked Intel to create 12 chips for the Busicom 141-PF calculator.