




Microchips, or integrated circuits as they are more correctly known, are the basic building blocks of our technological society, and very few of the electronic devices and systems we use today -- including camcorders and VCRs -- would be possible without them.



The integrated circuit or IC is a direct descendant of the point-contact transistor, developed in 1947 at the Bell Telephone Laboratories by John Bardeen and Walter H. Brattain, working under the direction of William Shockley. These three men are generally acknowledged to be the founding fathers of semiconductor electronics.



The conceptual origins of the integrated circuit can be traced back to a proposal made by G.W.A. Dummer of the Royal Aircraft Establishment during a conference on the transistor, held in Washington DC in 1952. He reasoned that since it was possible to manufacture other electronic components, such as resistors and capacitors, from semiconductor materials like germanium, which was used to produce the first transistors, there was no reason why all of the elements in an electronic circuit couldn't be combined on one piece of semiconductor material.



Similar ideas had been tried out, with some success, some fifteen years earlier, when valve manufacturers came up with one-piece valve amplifiers and radio tuners with all or most of the ancillary components built into the valve's glass envelope. However, like all thermionic devices, they had a limited life, were mechanically fragile, and were far more expensive than conventional methods of construction.



The manufacturing processes and materials available in the early 1950s were simply not capable of producing Dummer's ICs. However, by 1959 germanium was being replaced by highly purified silicon, which was both cheaper to produce and easier to work with. That same year semiconductor engineers managed to construct a simple electronic circuit, called a multivibrator, using two transistors, eight resistors and two capacitors mounted on top of a single slice of silicon. The first genuine IC, made by diffusing impurities into microscopic layers of silicon, was developed in 1960 using the 'planar' process. Those first chips were developed for use in American guided missiles.



Early ICs were essentially collections of single discrete components, usually just transistors and resistors, connected together by thin gold wires. By 1962, however, scientists had devised ways of isolating and interconnecting individual components on the chip itself; these were known as monolithic ICs. The first ICs were used for simple switching functions and were the forerunners of today's digital microcircuits; it wasn't until the late 1960s, when manufacturing tolerances had been sufficiently improved, that the first practical analogue or 'linear' ICs were developed. During the 1960s IC development was given enormous impetus by the American space programme, which put Neil Armstrong on the Moon in 1969. The navigation computer on Apollo 11 was the electronic marvel of the age, though by contemporary standards it was not much smarter than one of today's pocket calculators.



ICs are manufactured using a highly sophisticated photolithography process: wafers of silicon are coated with photoresistive materials and exposed to ultraviolet light through photographic masks. Acids are used to etch away unexposed areas; the wafer is then coated with further layers of silicon, doped with various impurities, and the etching process is repeated using a succession of masks to build up the complex three-dimensional circuit elements. Each wafer can contain hundreds of identical microcircuits, each with many thousands of transistors and other components, though until the early 1980s there was a very high failure rate and ICs were very expensive to produce, which limited their use in domestic equipment.



The breakthrough into the consumer market occurred in the early 1970s with the introduction of the first pocket calculators. In 1970 there were just two very expensive models on sale; by 1974 there were over four hundred! This sudden proliferation was brought about almost single-handedly by the Texas Instruments TMS 0100 LSI (large-scale integration) microchip, which was used in the majority of calculators for some years. The other key development, which also took place in the early 1970s, was the microchip memory device, the most important of which is the RAM (random access memory) chip.



From the calculator and memory IC it was a relatively short step to the one-chip computer, or microprocessor, sometimes called the universal microchip because of its almost unlimited applications. Prior to this it was usual to design custom chips for specific tasks, which was both expensive and time-consuming. The microprocessor is unique because it can be programmed to perform a multitude of jobs, under instruction from information held in its own on-board memory or in an external program. Microprocessors are now common in a wide range of everyday appliances, from microwave ovens to washing machines, but their greatest impact on our society, now and in the future, has been in the development of the microcomputer, now an essential part of living and working in the late twentieth century.



The IC has made home video possible, and it would not be possible to build a camcorder without microchips, at least not one that you would be able to carry about on your own... Camcorders are designed by computers and built using computer-controlled robots. In addition to a score of ICs, some camcorders now contain several microprocessors, controlling the camera's transport functions, operating the exposure and white balance systems, and generating on-screen displays, graphics and titles. VCRs depend on them for deck control operations and timer and display functions, while televisions have them in their tuners, picture control circuits, remote control systems and teletext decoders.




(c) R.Maybury 1993 0104





Copyright (c) 2005 Rick Maybury Ltd.