Why Is Silicon Used for Most Computer Chips?
by Sarah Morse
In 1965, Gordon Moore predicted that the number of transistors on integrated circuits would double approximately every two years, making computers steadily faster and more powerful. His prediction, now known as Moore's Law, still held true at the time of publication. It was the ease of working with silicon, and its flexibility as a material, that made this rapid pace of development possible.
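The doubling rule behind Moore's Law can be sketched in a few lines of arithmetic. A minimal illustration, assuming a starting point of roughly 2,300 transistors in 1971 (the Intel 4004-era figure, an assumption not taken from this article):

```python
def transistor_count(start_count, start_year, year):
    """Project a transistor count assuming a doubling every two years,
    per Moore's Law. Starting figures are illustrative assumptions."""
    doublings = (year - start_year) / 2
    return start_count * 2 ** doublings

# Ten doublings over twenty years: 2,300 * 2**10 = 2,355,200
print(round(transistor_count(2300, 1971, 1991)))  # 2355200
```

Twenty years of doubling every two years multiplies the count by 2^10, or 1,024, which is what makes the growth Moore described exponential rather than linear.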
A semiconductor lies somewhere between a conductor and an insulator. Conductors, such as copper and other metals, let current flow so freely that an electric signal is difficult to control. Insulators, such as glass and rubber, block electric signals entirely. Semiconductors, and silicon in particular, can do a little of both: depending on how manufacturers treat the element, silicon can conduct, insulate, or do something in between. The treatment is called "doping," a process that introduces impurities into the silicon crystal.
Silicon is not the only semiconductor; carbon and germanium have similar properties. Carbon, in its diamond form, is too brittle to use in chips. Germanium chips were common early in the computer era, and the element still appears in some chips today. Silicon, however, remains a semiconductor at much higher temperatures than germanium, which matters when chips sit near other heat-retaining electronic components inside a computer.
Compared with other semiconductors, silicon's conductivity is especially easy to change. Through doping, manufacturers can introduce elements that make silicon more conductive, less conductive, or non-conductive. This means manufacturers can build chips from fewer materials, packing in more intricate circuits for greater function.
After oxygen, silicon is the second most abundant element in Earth's crust. It can be extracted from sand relatively easily. This availability, combined with the ease of creating circuits with silicon, makes it very inexpensive to produce compared with other semiconductors.