The History Of The Semiconductor Industry

March 29, 2023


Semiconductors


Would you like to learn the history of the semiconductor industry? From the earliest experiments on semiconductor properties in the 19th century and the invention of the first transistors in the 1940s, to the sophisticated microprocessors powering today’s advanced artificial intelligence technologies, the semiconductor industry has come a long way. Global competitiveness among semiconductor manufacturers is at an all-time high, with constant research, development, and innovation needed to meet the demands of the future.

Let’s find out more about the main milestones in the history of the semiconductor industry, as well as the individuals and companies responsible for driving this innovation. We’ll also go over how semiconductor technology went from being used almost exclusively in military, aerospace, and national security applications to powering the electronics in the hands of the average consumer.

Semiconductor Technology And Its Importance

Semiconductor technology is a vast and rapidly growing industry that plays an essential role in our lives. The microprocessors or chips that power our smartphones, laptop computers, video gaming consoles, smartwatches, and many other devices we commonly use today contain millions, and sometimes billions, of tiny semiconductor components. Advances in semiconductor tech have enabled consumer electronics to become smaller, faster, and more reliable.

A Modern Microprocessor
Image taken from www.futurecdn.net

Additionally, semiconductors are essential for the electronic circuits used in telecommunication networks, Internet of Things (IoT) applications, high-performance computing in data centers, advanced medical equipment, industrial manufacturing systems, smart vehicles, weather monitoring, and military and national security applications. The semiconductor industry has therefore been a key driver of innovation and economic growth as it continues to spur development in numerous industries around the world.

For semiconductor manufacturing to meet the growing demand, both government funding and private investment have been key to building the state-of-the-art fabrication plants that can supply the volume needed. As a result, the semiconductor industry has grown rapidly in recent decades, with firms from Taiwan, South Korea, the United States, and China dominating the industry and taking up significant market share. The semiconductor industry worldwide generated revenue of over US$580 billion in 2022.

The semiconductor industry also relies on research and development to produce the new semiconductor materials and components needed to keep up with Moore’s Law. Universities around the world continue to lead the way in advancing materials science, and many leading firms recruit their engineers directly from such institutions to maintain their lead in the face of global competition.

Massive investment also goes into improving manufacturing yields through new processes, cutting-edge equipment, and Industry 4.0 concepts like smart manufacturing. It’s safe to say that semiconductor materials, and the technology made from them, will continue to be vital to the development of many new products for years to come.

The History Of The Semiconductor Industry

Let’s look at the most significant milestones in the history of the semiconductor industry, the research that led to them, the effect each innovation had on semiconductor-based technology, and how these advancements led to the subsequent development of other industries around the world.   

The Discovery Of Semiconductors

It was the English scientist Michael Faraday, best known for his work on electromagnetism, who first noticed in 1833 that the electrical resistance of some materials decreases with rising temperature. This was the opposite of how conductors like Copper behaved. Later, in 1873, electrical engineer Willoughby Smith discovered that resistors made from Selenium showed a reduction in resistance when light fell on them. He is credited with discovering the photoconductivity of Selenium.

Sir Alan Herries Wilson who came up with the band theory that explained how solid materials conducted electricity
Image taken from www.cdn.shopify.com

There were also other important observations, such as conduction and rectification in metallic sulfides, noted on separate occasions by Peter Munck af Rosenschold and Karl Ferdinand Braun, and the photovoltaic effect in Selenium, discovered in 1876 by William Grylls Adams and Richard Evans Day. While many scientists researching the electrical properties of materials did observe phenomena that we recognize today as semiconductor properties, a proper unified theory of solid-state physics was needed to explain how and why these materials behaved the way they did.

After the electron was discovered by the British physicist Sir J.J. Thomson in 1897, many theories emerged about how the movement of charge carriers like electrons allowed a current to flow in solid materials. The term ‘semiconductor’ first appeared as ‘Halbleiter’ (German for ‘half-conductor’) in a 1910 Ph.D. thesis by Josef Weiss, while his teacher Johan Koenigsberger was the first to use the term ‘variable conductors’ for a category of solid materials alongside conductors and insulators.

Band theory, which explains how solid materials conduct electricity, was established by the British mathematician Sir Alan Herries Wilson in 1931. The characteristics of a metal-semiconductor junction were first modeled by Walter H. Schottky and Nevill Francis Mott, while Boris Davydov did pioneering work on the p–n junction and minority carriers.

Quantum mechanics was not well understood in these early days, and many experiments failed to deliver results that matched theoretical models due to impurities in the semiconductor materials being tested. However, this did drive the development of better material refining methods, which helped later experiments. To learn more about band theory and how semiconductors conduct electricity, read Understanding the Chemistry of Semiconductors.
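
As a rough illustration of what band theory predicts (the formula below is the modern textbook result, not notation from Wilson’s 1931 work), the number of charge carriers available for conduction in a pure semiconductor grows exponentially with temperature:

n_i \propto T^{3/2} \, e^{-E_g / (2 k_B T)}

Here n_i is the intrinsic carrier concentration, E_g is the band gap energy, k_B is Boltzmann’s constant, and T is the absolute temperature. This is precisely why Faraday saw resistance fall as temperature rose: heating a semiconductor frees more carriers for conduction, whereas in a metal like Copper the carrier count stays fixed and resistance instead rises with temperature.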

The Earliest Semiconductor Devices

Some of the earliest semiconductor devices of the 20th century made use of the light-sensitive properties of these materials. British engineer H.J. Round passed a current through Silicon carbide crystals and noticed the emission of light in 1906, and Oleg Losev observed the same in 1922. Neither was aware of the practical implications, but these discoveries would pave the way for the later invention of the light-emitting diode (LED).

Jagadish Chandra Bose built a point-contact microwave detector, also called a ‘cat’s whisker detector’, in 1904 using a semiconducting material called Galena (Lead Sulfide). Some of the earliest radio receiver components designed by G. W. Pickard were made using these crystal detectors. The devices were very unpredictable and required many manual adjustments before use. While more reliable vacuum tubes soon replaced the cat’s whisker detectors, they were nevertheless vital to the development of radio technology. This type of crystal detector was one of the first practically useful semiconductor electronic devices, and it was also one of the first semiconductor diodes.

A Galena cat’s whisker detector used in crystal radio receivers 
Image taken from www.wikimedia.org

During World War II, military engineers soon discovered the limitations of vacuum tube technology, chiefly its poor performance at detecting radio frequencies above 4000 MHz. Research began on Lead Sulfide and Lead Selenide materials, which showed promise in the infrared detection of aircraft and naval vessels as well as in voice communication systems. The point-contact crystal detector made a comeback for microwave radios, and many firms working for the US government finally began research into Silicon-based semiconductor materials, which showed much potential.

To learn more about semiconductors used in radio signal applications, read The RF Semiconductor And Frequency Range.

The First Diodes

In the late 1930s, American scientist Russell Ohl, who was working for Bell Laboratories, revisited the cat’s whisker device and realized that using more refined Silicon crystals improved its reliability. He also observed that light falling on the crystal improved its conductivity. When Ohl presented his findings to his colleagues at Bell Labs, physicist Walter Brattain realized that the visible crack in the crystal acted as a junction.

Further studies revealed that the two sides of the crystal, separated by the crack, had different concentrations of impurities. As a result, one side had an excess of electrons while the other tended to collect them. The two sides would eventually become known as the ‘emitter’ and the ‘collector’ respectively. When a voltage was applied, electrons could be pushed from the emitter to the collector, creating an electric current. This did not work when the voltage was reverse-biased.

This was essentially a solid-state diode, and the phenomenon it displayed was called ‘semiconduction’. Armed with the knowledge of how both negative and positive charge carriers behaved around the junction, many different institutions got to work. With the government ready to invest, companies like Bell Labs and universities like MIT, Purdue, and the University of Chicago set about improving the crystals. The result was Germanium being tested and perfected to create diodes for military radar systems during the war.
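
As an aside, the one-way behavior described above is captured by what is now called the ideal diode equation. The form below is the modern textbook version, a later formalization rather than anything written down by Ohl’s team:

I = I_S \left( e^{V / V_T} - 1 \right), \qquad V_T = k_B T / q \approx 26 \text{ mV at room temperature}

For a forward voltage V, the current I grows exponentially; for a reverse voltage, the exponential term vanishes and only the tiny saturation current I_S leaks through. That asymmetry is exactly the rectifying behavior the Bell Labs researchers were observing.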

The First Transistor

A Model Of The First Transistor
Image taken from www.theconversation.com

Ohl and Brattain continued their research into the mobility of electrons in semiconductors, which eventually spawned a new branch of study related to quantum mechanics called ‘surface physics’. Their understanding of the diode’s depletion region, developed along with the rest of the team including William Shockley and John Bardeen, led to the realization that a third lead was needed in addition to the emitter and collector leads. This third lead, known as the ‘base’, was needed to control the behavior of the electron flow across the junction.

The team at Bell kept working to develop these ideas into a working device. Eventually, on December 23rd, 1947, they gave the first official demonstration of the p-n-p point-contact Germanium transistor. The device was able to amplify a signal with a power gain of 18. The term ‘transistor’ was suggested by John R. Pierce when Bell Labs needed a new name for the device; others like ‘semiconductor triode’, ‘surface states triode’, and ‘crystal triode’ were considered as well. In 1956, William Shockley, John Bardeen, and Walter Brattain were awarded the Nobel Prize in Physics for their invention.

Improving The Transistor Design

William Shockley would invent the bipolar junction transistor (BJT) in 1948 as an improvement on the point-contact transistor. So robust was the design, in fact, that it remained a vital part of both discrete and integrated circuits until the 1960s, when it was eventually replaced by better technology. The materials, manufacturing methods, and other features kept getting refined over the next decade.

Since Germanium was too sensitive to temperature and had purity issues that lowered yields, it was replaced by Silicon, which became the semiconducting material of choice. Gordon K. Teal, who moved from Bell Labs to Texas Instruments, developed the first Silicon-based transistor. Throughout the 1950s, the transistor kept evolving, with new improvements added every year. Intermediate designs like the alloy-junction transistor and the mesa transistor are no longer in use.

The planar transistor, developed at Fairchild Semiconductor by Dr. Jean Hoerni, would prove to be a real breakthrough for the entire semiconductor industry in terms of manufacturing. Planar transistors could be mass-produced quickly, cost less, and included a silica passivation layer that prevented the junctions from degrading over time. The planar transistor made all previous transistor designs obsolete and became a key driver of the success of the entire semiconductor industry moving forward.

From National Security And Military Applications To Commercial Use 

The Sonotone 1010 Hearing Aid 
Image taken from www.wikimedia.org

With World War II over, the structure of the industry changed as commercial production of transistors began in the early 1950s. Western Electric ran the first commercial transistor production line, making point-contact Germanium transistors. Telecommunications equipment made by AT&T as well as hearing aids made by Sonotone were some of the first commercially available products to use Germanium transistors.

The Transistor Radio

The real game changer that took transistor technology into the hands of millions around the world was the pocket transistor radio. Intermetall, a German company, was the first to demonstrate a working prototype at the Düsseldorf Radio Fair in 1953; the device contained four hand-made transistors. In 1954, the first commercially available transistor radio was released by the Regency Division of Industrial Development Engineering Associates. The ‘Regency TR-1’, as it was called, had four Germanium transistors manufactured by Texas Instruments and sold well for the time.

The Regency TR-1 Transistor Radio
Image taken from www.jamesbutters.com

Tokyo Telecommunications Company also released a transistor radio, the TR-55, in 1955 after acquiring a manufacturing license for junction transistor technology from Bell Labs. It was very similar to the Regency TR-1 but didn’t sell well outside of Japan. Chrysler and Philco worked together to make the first all-transistor car radio, releasing it as an option in their 1956 line of new automobiles.

In 1957, Tokyo Telecommunications Company, now operating under the new brand name Sony, released the TR-63, the world’s first mass-produced transistor radio. It took the world by storm, selling seven million units by the mid-1960s and capturing most of the market for pocket radios. Its success also paved the way for other Japanese companies like Sharp and Toshiba to enter the consumer electronics market.

Transistors soon replaced vacuum tubes, which was another key driver that boosted the semiconductor industry moving forward. The transistor became a household name, the first semiconductor device to do so. It also became a vital component in many of the commercially sold IBM computers of the late 1950s and early 60s.

The MOSFET And The First Integrated Circuits

Even as the semiconductor industry pushed toward successful commercial applications, a replacement was needed for the junction transistor. The devices were too bulky, which limited their practical applications. Researchers on the cutting edge of semiconductor technology knew that the field-effect transistor (FET) would likely be the next step.

Egyptian engineer Mohamed Atalla was the first to propose coating Silicon wafers with an insulating layer of Silicon Oxide to passivate the surface states that were causing problems. This was the metal-oxide-semiconductor (MOS) process, which allowed Atalla and Korean-American engineer Dawon Kahng to develop the first MOS field-effect transistor (MOSFET) in 1959.

A Metal Oxide Silicon Field Effect Transistor
Image taken from www.wigatos.com

Alongside these innovations, Jack Kilby of Texas Instruments made one of the earliest integrated circuits (ICs, or chips), in which all the circuit components were produced on a single piece of semiconductor material, in this case Germanium. In 1959, Robert Noyce at Fairchild Semiconductor made the first true monolithic IC in Silicon using the planar process. ICs dramatically increased circuit performance since signals no longer needed to travel down lengthy wires from input to output. The US Air Force and NASA were the main consumers of such microchips during the early 1960s.

The MOSFET had much lower power consumption than a traditional junction transistor and could be mass-produced easily. It was also the first transistor that could be reliably miniaturized, making it possible to manufacture large numbers of transistors on a single chip. The first IC made with multiple MOSFETs was a 16-transistor microchip made by the Radio Corporation of America (RCA) in 1962, and the first commercially available one had 120 transistors and was made by General Microelectronics in 1964.

In 1963, Chih-Tang Sah and Frank Wanlass of Fairchild Semiconductor improved on the MOS process with a new technique called ‘complementary MOS’, or CMOS. MOS technology evolved quickly, with more and more transistors being packed onto a single chip as predicted by Moore’s Law. The MOSFET is considered one of the most important inventions not just in the semiconductor industry but in all of modern electronics, and it would eventually lead to the development of the first computer microprocessors in the 1970s.
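
As a back-of-the-envelope illustration of that prediction (the two-year doubling period and the baseline figures below are common approximations, not data from this article), Moore’s Law can be sketched in a few lines of Python:

```python
# A minimal sketch of Moore's Law as a simple doubling projection.
# Assumptions: a two-year doubling period and the Intel 4004's
# roughly 2,300 transistors (1971) as the starting point.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Project the transistor count per chip for a given year."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Under these assumptions, the projection for 2021 comes out to roughly 77 billion transistors per chip, the same order of magnitude as the real flagship chips discussed later in this article.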

Very Large-Scale Integration In The Semiconductor Industry

The MOS chips continued to evolve during the 1970s to the point where it was possible to fit over 10,000 transistors on a single chip. The process that enabled this was called ‘very large-scale integration’ (VLSI). Soon, everything from home appliances to business machines and telecommunications equipment started using these microchips.

An Integrated Circuit Made With VLSI Design
Image taken from www.archive.fo

The development of MOS chip technology led to the first commercially produced microprocessor, the Intel 4004, released in 1971 and used in business calculators. As microprocessors became smaller and more affordable through mass production, they paved the way for the personal computer revolution of the late 1970s and 80s. The 8-bit Intel 8080 microchip was used in the Altair 8800, considered by many to be the first personal computer.

The methodology for VLSI design was conceptualized by Carver Mead and Lynn Conway. It allowed semiconductor manufacturers to minimize the interconnect fabric area, thereby saving valuable space on the microchip. Their book, ‘Introduction to VLSI Systems’, was published in 1979 and used in integrated circuit design courses around the world.

The VLSI design concepts in this book eventually led to the first silicon compiler software, which could generate an integrated circuit design based on a user’s specifications. Carver Mead was also an advocate of fabless manufacturing, a model in which device manufacturers specify the microprocessors they need to semiconductor companies that design the chips to the customer’s specifications, while the actual fabrication is outsourced to companies that operate fabrication facilities. This is largely how the semiconductor industry operates today.

Fabless Manufacturing In The Semiconductor Industry Today

Fabless Companies, Integrated Device Manufacturers, And Foundries
Image taken from www.poems.com

Before the 1980s, many companies in the semiconductor industry designed their chips, operated Silicon-wafer fabrication facilities, developed their own manufacturing processes, tested their chips, and sold them to customers. These are the integrated device manufacturers (IDMs), with Intel and Samsung being current examples. The barrier to entry used to be very high since semiconductor manufacturing was so technology-intensive.

Later, the fabless model spread across the industry, with smaller companies designing and selling microchips without actually operating a fabrication facility. Nvidia, Apple, and AMD are such companies; the fabrication of their ICs is handled by foundries like TSMC and GlobalFoundries, which focus solely on fabrication.

The global competitiveness seen in the semiconductor industry today has led many leading IDMs to spread their facilities all over the world. Intel is no longer just in the US but in Asia and Europe as well. TSMC, the largest foundry operator, has facilities in Taiwan, Singapore, and the US. Samsung, the largest semiconductor manufacturer, also operates in the US in addition to its home country of South Korea. Even fabless firms like Qualcomm have facilities across the world.

The modern semiconductor components currently in demand include microprocessors for high-performance computing applications, flash memory devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and complex programmable logic devices (CPLDs).

The Future Of The Semiconductor Industry

The semiconductor industry today works at the nanometer scale, with transistor counts exceeding a hundred billion on a single microchip. Apple’s M1 Ultra has over 114 billion transistors, the highest of any consumer device at the moment. This was made possible by TSMC’s 5-nanometer semiconductor manufacturing process. Advanced chip design like this is done using electronic design automation (EDA) tools, which help engineers create innovative designs and run simulations long before any physical device is produced.

Research continues to push semiconductor manufacturing even further, producing smaller and even more sophisticated ICs for the high-performance computing requirements of tomorrow. To learn more about semiconductors, integrated circuits, their manufacturing methods, and the industry, check out the Inquivix Technologies Blog today!

FAQs

When Did The Semiconductor Industry Get Started?

The semiconductor industry began in the 1950s with the introduction of the first commercially available junction transistors used in consumer electronics like pocket radios. Another important industry milestone was the invention of MOSFET technology and the first integrated circuit in the late 1950s and early 60s. 

What Were The First-Generation Semiconductor Materials Used In Early Components?

Germanium and Silicon were the semiconductor materials used in the earliest transistor devices made in the 1940s. However, the cat’s whisker detector that predates the transistor used Galena (Lead Sulfide) crystals as its semiconducting material.

What Is Fabless Semiconductor Manufacturing?

Some firms in the semiconductor industry are fabless. They design and sell microchips to customers, but outsource the manufacturing to a separate fabrication plant. 
