The birth of the Soviet missile defense system. Long road to integrated circuits

As for the first task, alas, as we mentioned in the previous article, there was not a hint of computer standardization in the USSR. This was the greatest scourge of Soviet computing (along with the bureaucrats), and just as impossible to overcome. The idea of a standard is an often underestimated conceptual discovery of humanity, worthy of standing alongside the atomic bomb.

Standardization brings unification, assembly-line production, enormous savings in implementation and maintenance, and enormous interconnectivity. All parts become interchangeable, machines can be stamped out by the tens of thousands, and synergy sets in. The idea had been applied to firearms a hundred years earlier and to cars forty years earlier, with breakthrough results in both cases. It is all the more striking that only in the USA did anyone think to apply it to computers. As a result, we ended up copying the IBM S/360, and what we really stole was not the mainframe itself, nor its architecture, nor its breakthrough hardware. All of that could easily have been domestic: we had more than enough skilled hands and bright minds, and plenty of technologies and machines that were brilliant even by Western standards - Kartsev's M series, Setun, MIR, the list goes on. In copying the S/360 we borrowed, first and foremost, something that had been entirely absent throughout all our years of developing electronic technology up to that point - the idea of a standard. That was the most valuable acquisition. And, unfortunately, the fatal lack of conceptual thinking outside of Marxism-Leninism and the "genius" of Soviet management did not allow us to arrive at it earlier on our own.

However, we will talk about the S/360 and the ES EVM series later - a painful and important topic that is also tied to the development of military computers.

Standardization in computer technology was brought about by the oldest and greatest of hardware companies - IBM, of course. Until the mid-1950s it was taken for granted that computers were built one at a time or in small series of 10-50 machines, and nobody thought to make them compatible. That all changed when IBM, spurred on by its eternal rival UNIVAC (which was building the LARC supercomputer), decided to build the most complex, largest, and most powerful computer of the 1950s - the IBM 7030 Data Processing System, better known as Stretch. Despite the advanced component base (the machine was intended for the military, and IBM therefore received a huge number of transistors from them), Stretch's complexity was prohibitive: more than 30,000 boards, each carrying several dozen components, had to be designed and assembled.

Stretch was developed by such greats as Gene Amdahl (later an S/360 developer and founder of Amdahl Corporation), Frederick P. Brooks, Jr. (also an S/360 developer and author of the concept of software architecture), and Lyle R. Johnson (author of the concept of computer architecture).

Despite the machine's enormous power and a huge number of innovations, the project was a complete commercial failure: only 30% of the announced performance was achieved, and the company's president, Thomas J. Watson Jr., cut the price of the 7030 proportionally, several times over, which led to large losses.

Later, the Stretch project was listed among the ten biggest management failures of the IT industry in Jake Widman's "Lessons Learned: IT's Biggest Project Failures" (PC World, 10/09/08). Project leader Stephen Dunwell was made the scapegoat for Stretch's commercial failure, but soon after the phenomenal success of System/360 in 1964 he pointed out that most of its core ideas had first been tried in the 7030. As a result, he was not only forgiven: in 1966 he received an official apology and the honorary position of IBM Fellow.

The 7030's technology was ahead of its time: instruction and operand prefetching, parallel arithmetic, memory protection, interleaving and RAM write buffers, and even a limited form of instruction reordering called instruction pre-execution - the grandfather of the same technology in Pentium processors. Moreover, the processor was pipelined, and the machine could transfer data between RAM and external devices directly (using a dedicated channel coprocessor), offloading the central processor. It was a kind of expensive forerunner of the DMA (direct memory access) technology we use today, although Stretch channels were controlled by separate processors and had far more functionality than today's modest implementations (and cost far more!). This technology later migrated to the S/360.
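
The channel idea is easy to picture in modern terms. Below is a conceptual sketch, not of Stretch's actual channel hardware: a hypothetical "channel" routine runs alongside the main program (here simply as a Python thread) and moves a block of data while the "CPU" keeps computing.

```python
# Conceptual sketch of a data channel / DMA-style transfer: the main program
# hands a transfer request to a separate engine and keeps computing while the
# copy happens in the background. The thread below only stands in for a
# dedicated channel coprocessor - it is an illustration, not real DMA code.
import threading

RAM = list(range(1_000_000))      # pretend main memory
device_buffer = []                # pretend external device

def channel_transfer(src, dst, start, length):
    """The 'channel coprocessor': copies data without the CPU's involvement."""
    dst.extend(src[start:start + length])

# The "CPU" queues the transfer and immediately returns to useful work.
channel = threading.Thread(target=channel_transfer,
                           args=(RAM, device_buffer, 0, 100_000))
channel.start()

busy_work = sum(i * i for i in range(200_000))   # CPU keeps computing meanwhile

channel.join()                    # wait for the channel to signal completion
print(f"transferred {len(device_buffer)} words while the CPU computed {busy_work}")
```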

The range of applications for the IBM 7030 was huge: nuclear weapons development, meteorology, calculations for the Apollo program. Only Stretch could handle all of this, thanks to its massive memory and incredible processing speed. Up to six instructions could be in flight in the indexing unit, and up to five more could be loaded into the prefetch unit and the parallel ALU. Thus, at any given moment up to 11 instructions could be at different stages of execution - setting aside the outdated component base, modern microprocessors are not that far from this architecture. Intel Haswell, for example, can work on up to 15 different instructions at once, just 4 more than a processor from the 1950s!
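
To make "instructions at different stages of execution" concrete, here is a minimal sketch of pipelined overlap; the four stage names and the instruction count are illustrative assumptions, not the real Stretch units.

```python
# Minimal pipeline illustration: a 4-stage pipeline (stages are illustrative)
# overlaps instructions so that several are "in flight" in the same cycle.
STAGES = ["FETCH", "DECODE", "EXECUTE", "WRITE"]
instructions = [f"I{i}" for i in range(1, 8)]   # seven dummy instructions

total_cycles = len(instructions) + len(STAGES) - 1
for cycle in range(total_cycles):
    in_flight = []
    for idx, name in enumerate(instructions):
        stage = cycle - idx            # which stage this instruction occupies now
        if 0 <= stage < len(STAGES):
            in_flight.append(f"{name}:{STAGES[stage]}")
    print(f"cycle {cycle + 1:2d} | {len(in_flight)} in flight | " + ", ".join(in_flight))
```

With the four stages assumed here, up to four instructions overlap in any cycle; Stretch, with its larger set of units (indexing, prefetch, parallel arithmetic), pushed that figure to eleven.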

Ten systems were built, and the Stretch program cost IBM $20 million in losses, but its technological legacy was so rich that commercial success followed almost immediately. Despite its short life, the 7030 brought many benefits, and architecturally it was one of the five most important machines in history.

Nevertheless, IBM saw the unlucky Stretch as a failure, and it was precisely because of this that the developers learned the main lesson: hardware design was no longer an anarchic art. It had become an exact science. Summing up their work, Johnson and Brooks wrote a foundational book published in 1962, "Planning a Computer System: Project Stretch."

Computer design was divided into three classical levels: the design of the instruction set, the design of the microarchitecture that implements it, and the design of the system architecture of the machine as a whole. The book was also the first to use the now-classic term "computer architecture." Methodologically it was a priceless work, a bible for hardware designers and a textbook for generations of engineers. The ideas set out in it were adopted by every computer corporation in the United States.
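
The split between an instruction set and the hardware that implements it is what later made compatible machine families like the S/360 possible. The sketch below is a toy illustration under invented assumptions (a three-instruction "ISA" and two made-up implementations), not anything from the book: two different "microarchitectures" honor the same instruction set, so a program cannot tell them apart.

```python
# A toy three-instruction "ISA": the contract visible to the programmer.
# Both implementations below honor it, so programs run unchanged on either -
# the essence of the ISA / microarchitecture split described above.
PROGRAM = [("LOAD", 5), ("ADD", 7), ("STORE", "result")]

class SimpleMachine:
    """Naive 'microarchitecture': executes one instruction at a time."""
    def __init__(self):
        self.acc, self.memory = 0, {}
    def run(self, program):
        for op, arg in program:
            if op == "LOAD":
                self.acc = arg
            elif op == "ADD":
                self.acc += arg
            elif op == "STORE":
                self.memory[arg] = self.acc
        return self.memory

class CleverMachine:
    """Different internal organisation (additions are folded together before
    the store), but externally indistinguishable: same ISA, same results."""
    def __init__(self):
        self.memory = {}
    def run(self, program):
        acc, pending = 0, 0
        for op, arg in program:
            if op == "LOAD":
                acc, pending = arg, 0
            elif op == "ADD":
                pending += arg
            elif op == "STORE":
                self.memory[arg] = acc + pending
        return self.memory

assert SimpleMachine().run(PROGRAM) == CleverMachine().run(PROGRAM) == {"result": 12}
```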

The tireless pioneer of cybernetics, the already mentioned Kitov (not only a phenomenally well-read man who, like Berg, constantly followed the Western press, but a true visionary), arranged for its Russian publication in 1965 (Designing ultrafast systems: the Stretch complex; edited by A. I. Kitov. - M.: Mir, 1965). The book was cut by almost a third, and although Kitov specifically highlighted the main architectural, systemic, logical and software principles of computer design in an extended preface, it passed almost unnoticed.

Finally, Stretch gave the world something new that had not yet been used in the computer industry - the idea of standardized modules, from which the entire industry of integrated circuit components later grew. Every person who goes to the store for a new NVIDIA video card, plugs it in place of the old ATI card, and finds that everything just works should give a mental thanks to Johnson and Brooks at that moment. These men invented something more revolutionary (and far less noticeable and less immediately appreciated - the developers in the USSR, for example, paid no attention to it at all!) than the pipeline or DMA.

They invented standard, compatible boards.

SMS

As we have already said, the Stretch project had no equal in complexity. The giant machine was to contain over 170,000 transistors, not counting hundreds of thousands of other electronic components. All of this somehow had to be assembled (recall how Yuditsky tamed his unruly huge boards by breaking them into separate elementary devices - unfortunately, in the USSR this practice never became standard), debugged, and then maintained, with faulty parts replaced. So the developers proposed an idea that seems obvious from the height of today's experience: first design individual small blocks, implement them on standard cards, and then assemble the machine from those cards.

This is how SMS - the Standard Modular System - was born, and it was used everywhere after Stretch.

It consisted of two components. The first was the card itself, 2.5 x 4.5 inches in size, carrying the basic elements and fitted with a 16-pin gold-plated connector; there were single- and double-width cards. The second was a standard card cage, with the buses routed along the back.

Some types of cards could be configured with a jumper (just as motherboards are configured today). This feature was meant to reduce the number of card types an engineer had to carry around. Nevertheless, the number of types soon exceeded 2,500, owing to the implementation of many families of digital logic (ECL, RTL, DTL, etc.) as well as analog circuits for various systems. Even so, SMS did its job.

SMS cards were used in all second-generation IBM machines and in numerous peripherals of the third-generation machines, and they served as the prototype for the more advanced SLT modules of the S/360. It was this "secret" weapon - which, however, no one in the USSR paid much attention to - that allowed IBM to ramp production of its machines up to tens of thousands per year, as we mentioned in the previous article.

The technology was borrowed by every participant in the American computer race, from Sperry to Burroughs. Their combined production volumes could not compare with those of the fathers at IBM, but between 1953 and 1963 it allowed them to flood not only the American but also the international market with computers of their own design, literally driving out all the regional manufacturers - from Bull to Olivetti. Nothing prevented the USSR from doing the same, at least within the CMEA countries, but, alas, before the ES series the idea of a standard never visited the heads of our state planners.

Compact packaging concept

The second pillar after standardization (which paid off a thousandfold in the transition to integrated circuits and led to the so-called libraries of standard logic cells, used without significant change from the 1960s to the present day!) was the concept of compact packaging, which people had been thinking about even before integrated circuits - and even before transistors.

The war for miniaturization can be divided into four stages. The first was pre-transistor, when people tried to standardize and shrink vacuum tubes. The second was the emergence and adoption of printed circuit boards. The third was the search for the most compact packaging of transistors: micromodules, thin-film and hybrid circuits - in short, the direct ancestors of ICs. And finally, the fourth was the ICs themselves. The USSR traveled all of these paths (except tube miniaturization) in parallel with the USA.

The first combined electronic device was a kind of "integrated tube," the Loewe 3NF, developed by the German company Loewe-Audion GmbH in 1926. This dream of the warm-tube-sound fanatic consisted of three triodes in one glass envelope, together with the two capacitors and four resistors needed to make a complete radio receiver. The resistors and capacitors were sealed in their own glass tubes to keep them from contaminating the vacuum. It was, in effect, a "receiver-in-a-tube," like a modern system-on-chip! All that had to be bought separately to build a radio were a tuning coil and capacitor and a loudspeaker.

However, this miracle of technology was created not to usher in the era of integrated circuits a few decades early, but to dodge the German tax levied on every tube socket (a Weimar Republic luxury tax). Loewe receivers had only one socket, which gave their owners a considerable financial advantage. The idea was developed further in the 2NF line (two tetrodes plus passive components) and the monstrous WG38 (two pentodes, a triode and passive components).

In general, tubes had enormous potential for integration (though the cost and complexity of the design rose exorbitantly); the pinnacle of such technology was the RCA Selectron. This monstrous tube was developed under the direction of Jan Aleksander Rajchman (nicknamed "Mr. Memory" for creating six kinds of RAM, from semiconductor to holographic).

John von Neumann

After the construction of ENIAC, John von Neumann moved to the Institute for Advanced Study (IAS), where he was eager to continue work in a new and, he believed, important scientific direction - computers (he considered them more important than atomic bombs for victory over the USSR). Von Neumann intended the architecture he designed (later named after him) to become the reference for machine design in all universities and research centers in the United States (which, incidentally, is partly what happened) - again, a drive for unification and simplification!

For the IAS machine, von Neumann needed memory, and RCA, the leading manufacturer of vacuum devices in the United States in those years, generously offered to sponsor it with Williams tubes. The hope was that by including them in the reference architecture, von Neumann would help establish them as the RAM standard, which would bring RCA enormous revenue in the future. The IAS design called for 40 kbit of RAM; the sponsors at RCA were somewhat dismayed by such an appetite and asked Rajchman's department to reduce the number of tubes.

Rajchman, with the help of the Russian émigré Igor Grozdov (many Russians worked at RCA in general, including the famous Zworykin, and the president, David Sarnoff, was himself a Belarusian-Jewish émigré), produced a truly astonishing solution - the crown of vacuum integration technology, the RCA SB256 Selectron, a RAM tube for 4 kbit! However, the technology turned out to be insanely complicated and expensive: even production tubes cost about $500 apiece, and the base was a monster with 31 pins. In the end, the project never found a buyer because of delays getting into production - ferrite-core memory was already just around the corner.

Tinkertoy project

Many computer manufacturers made deliberate attempts to improve the architecture (one can't yet speak of topology here) of tube modules in order to make them more compact and easier to replace.

The most successful attempt was IBM's series of standard tube units for the 70xx machines. The pinnacle of tube miniaturization was the first generation of Project Tinkertoy, named after the popular children's construction toy of the 1910s-1940s.

Things did not always go smoothly for the Americans either, especially when the government got involved in contracts. In 1950, the Navy's Bureau of Aeronautics commissioned the National Bureau of Standards (NBS) to develop an integrated system for the computer-aided design and production of modular, general-purpose electronic devices. At the time this was, in principle, justified: no one yet knew where the transistor would lead or how to use it properly.

NBS poured more than $4.7 million into the development (about $60 million in today's money), enthusiastic articles appeared in the June 1954 issue of Popular Mechanics and the May 1955 issue of Popular Electronics, and … the project fizzled out, leaving behind only a few deposition technologies and a series of 1950s radar buoys built from these components.

What happened?

The idea was a fine one: to revolutionize production automation and turn huge blocks à la IBM 701 into compact, versatile modules. The only problem was that the entire project was designed around tubes, and by the time it was completed the transistor had already begun its triumphant march. It turns out the USSR was not the only place that knew how to be late: Project Tinkertoy swallowed huge sums and proved completely useless.

Standard boards

The second approach to packaging was to optimize the placement of transistors and other discrete components on standard boards.

Until the mid-1940s, point-to-point wiring was the only way to mount components (it is, by the way, still well suited to power electronics and is used there to this day). It could not be automated and was not very reliable.

The Austrian engineer Paul Eisler invented the printed circuit board for his radio while working in Britain in 1936. By 1941, multilayer printed circuit boards were already being used in German naval magnetic mines. The technology reached the United States in 1943 and was used in the Mk53 radio proximity fuzes. Printed circuit boards became available for commercial use in 1948, but automated assembly processes (components were still being attached to them by hand) did not appear until 1956, developed by the US Army Signal Corps.

Similar work, incidentally, was being carried out at the same time in Britain by the already mentioned Geoffrey Dummer, the father of the integrated circuit. The government accepted his printed circuit boards, but his microcircuits, as we remember, were short-sightedly killed off.

Until the late 1960s and the invention of flat packages and sockets for microcircuits, the pinnacle of early computer board design was so-called cordwood (or "woodpile") packaging. It saved significant space and was often used where miniaturization was critical - in military hardware and supercomputers.

In a cordwood design, axial-lead components were mounted between two parallel boards and either joined with wire jumpers or connected with thin nickel tape. To avoid short circuits, insulating cards were placed between the boards, and perforations let the component leads pass through to the next layer.

The drawbacks of cordwood were that reliable welds required special nickel-plated leads, and thermal expansion could warp the boards (as happened in several modules of the Apollo computer); in addition, the scheme reduced the serviceability of a unit to the level of a modern MacBook. But until the advent of integrated circuits, cordwood gave the highest packing density achievable.

Naturally, the ideas for optimization did not end with the boards.

The first concepts for packaging transistors appeared almost immediately after the start of their serial production. The feasibility of using transistors in "miniature packaged circuits" was first examined in J. A. Morton's article "Present Status of Transistor Development" (Bell System Technical Journal, 31:3, May 1952). Bell developed seven types of integrated packaging for its early M1752-series transistors, each containing a board embedded in transparent plastic, but things never went beyond prototypes.

In 1957, the US Army and the NSA took an interest in the idea a second time and commissioned Sylvania Electronic Systems to develop something like miniature sealed cordwood modules for use in classified military machines. The project was named FLYBALL; several standard modules were developed, containing NOR, XOR and other logic elements. Created by Maurice I. Crystal, they were used in the cryptographic machines HY-2, KY-3, KY-8, KG-13 and KW-7. The KW-7, for example, consisted of 12 plug-in cards, each holding up to 21 FLYBALL modules arranged in 3 rows of 7. The modules were color-coded (20 types in all), with each color corresponding to a particular function.

Similar blocks, under the name Gretag-Bausteinsystem, were produced by Gretag AG of Regensdorf, Switzerland.

From 1960, Philips produced similar Series-1, 40-Series and NORbit blocks as elements of programmable logic controllers, intended to replace relays in industrial control systems; the line even included a timer circuit similar to the famous 555 chip. The modules were made by Philips and its subsidiaries Mullard and Valvo (not to be confused with Volvo!) and were used in factory automation until the mid-1970s.

The Electrologica X1 of 1958 likewise used miniature multi-colored modules, strikingly similar to Lego bricks. And in the GDR, at the Institute for Computing Machines of the Technical University of Dresden, Professor Nikolaus Joachim Lehmann built about ten miniature computers, designated D4a, for his students in 1959; they used a similar transistor packaging.

This exploratory work went on continuously from the late 1940s to the late 1950s. The problem was that no amount of cordwood trickery could get around the "tyranny of numbers" - a term coined by Jack Morton, vice president of Bell Labs, in his 1958 article in Proceedings of the IRE.

The trouble was that the number of discrete components in a computer had hit a ceiling. Machines of more than 200,000 individual components simply turned out to be inoperable - even though transistors, resistors and diodes were by then already highly reliable. A failure probability of a few hundredths of a percent per part, multiplied by hundreds of thousands of parts, still meant a significant chance that something in the computer was broken at any given moment. Point-to-point wiring, with literally miles of wire and millions of soldered joints, made matters even worse. The IBM 7030 remained the complexity limit for purely discrete machines; even the genius of Seymour Cray could not make the far more complex CDC 8600 run stably.
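
The arithmetic behind the tyranny of numbers is easy to reproduce. A minimal sketch, assuming an illustrative per-part fault probability of 0.01% (not a measured figure for components of that era):

```python
# Back-of-the-envelope "tyranny of numbers": even very reliable parts doom a
# machine built from enough of them. The 0.01% per-part fault probability is
# an assumed, illustrative figure.
p_part_failed = 0.0001        # chance a given part is faulty at any moment
for n_parts in (30_000, 200_000, 1_000_000):
    p_all_good = (1 - p_part_failed) ** n_parts
    expected_faults = n_parts * p_part_failed
    print(f"{n_parts:>9,} parts: P(machine fully working) = {p_all_good:.2e}, "
          f"expected faulty parts at any moment = {expected_faults:.0f}")
```

At 200,000 parts the probability that every component is healthy at a given moment collapses to about 2e-9, with roughly 20 parts faulty on average at any time - which is why the way forward was cutting the component count by orders of magnitude, not packing the same parts more densely.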

Hybrid chip concept

In the late 1940s, Central Radio Laboratories in the United States developed so-called thick-film technology: traces and passive elements were deposited on a ceramic substrate by a process similar to printed circuit board manufacturing, then bare transistors were soldered onto the substrate and the whole assembly was sealed.

This is how the concept of so-called hybrid microcircuits was born.

In 1954, the Navy poured another $5 million into a continuation of the failed Tinkertoy program, and the Army added $26 million on top. RCA and Motorola got down to business. The former refined the CRL idea, developing it into so-called thin-film circuits; among the results of the latter's work was the famous TO-3 package - anyone who has ever looked inside any electronics will recognize these hefty rounded cans with ears. Motorola released its first transistor in it, the XN10, in 1955, and the case was shaped to fit the miniature Tinkertoy tube socket, hence its recognizable form. It also reached the open market and was used from 1956 in car radios, and then everywhere; such packages are still in use today.

By 1960, hybrids (whatever they were called - microassemblies, micromodules, and so on) were being used routinely by the US military in their projects, replacing the earlier clumsy, bulky transistor packages.

The micromodule's finest hour came in 1963, when IBM developed hybrid circuits of its own, which it called SLT, for the S/360 series - a series sold in a million copies that founded a family of compatible machines still produced today and copied (legally or not) everywhere from Japan to the USSR.

Integrated circuits were no longer a novelty by then, but IBM rightly worried about their quality and was accustomed to keeping the entire production cycle in its own hands. The bet paid off: the mainframe was not just successful, it became as legendary as the IBM PC and made a comparable revolution.

Naturally, in later models such as the S/370 the company moved to full-fledged microcircuits, albeit in the same branded aluminum cans. SLT itself was a much larger-scale and cheaper adaptation of the tiny hybrid modules (only 7.62 x 7.62 mm) that IBM had developed in 1961 for the LVDC (an ICBM on-board computer, also used in the Gemini program). The funny thing is that there the hybrid circuits worked side by side with already full-fledged integrated TI SN3xx chips.

However, toying with thin-film technology, nonstandard micro-transistor packages and the like was a dead end from the start - a half-measure that did not allow a jump to a new level of quality, a real breakthrough.

The breakthrough had to consist in a radical, orders-of-magnitude reduction in the number of discrete elements and connections in a computer. What was needed was not clever assemblies but monolithic standard products replacing whole heaps of boards.

The last attempt to squeeze something out of classical technology was the turn to so-called functional electronics - an attempt to develop monolithic semiconductor devices that would replace not only vacuum diodes and triodes but also more complex tubes: thyratrons and dekatrons.

In 1952, Jewell James Ebers of Bell Labs created a four-layer transistor "on steroids" - the thyristor, an analogue of the thyratron. In 1956, Shockley's laboratory began work on bringing a four-layer diode - the dynistor - into serial production, but his quarrelsome nature and burgeoning paranoia kept the work from being finished and destroyed the group.

Work on germanium thyristor structures in 1955-1958 brought no results. In March 1958, RCA prematurely announced Wallmark's ten-bit shift register as a "new concept in electronic technology," but the actual germanium thyristor circuits were inoperable. Putting them into mass production would have required exactly the same level of microelectronics as monolithic circuits.

Thyristors and dynistors did find their place in technology - but not in computing, and only after the problems of manufacturing them had been solved by the advent of photolithography.

This bright idea occurred almost simultaneously to three people in the world: the Englishman Geoffrey Dummer (let down by his own government), the American Jack St. Clair Kilby (the luckiest of the three - a Nobel Prize for the creation of the IC), and the Russian Yuri Valentinovich Osokin (a result somewhere between Dummer's and Kilby's: he was allowed to create a very successful microcircuit, but in the end the work was not pursued further).

We will talk about the race for the first production IC, and how the USSR nearly seized priority in this field, next time.
