Gordon E. Moore, co-founder of Intel, observed in 1965 that the number of components on an integrated circuit doubled roughly every year, a projection he later revised to every two years. This observation, known as Moore's Law, underlies what I believe has had the most profound impact on American history. Without integrated circuits, culture-changing devices such as the iPhone would not be possible. The integrated circuit was created in 1958 by Jack Kilby of Texas Instruments, who later co-invented the handheld calculator. By placing all the necessary components on a single chip made of the same material, integrated circuits make electronic devices smaller, faster, and smarter. This has greatly impacted the entire world, especially America, where technology is integral to our everyday lives. Not only does it affect us now; it has shaped us for over fifty years.
The first forms of integrated circuits, called "small-scale integration" circuits, were crucial to aerospace projects. The decision to build the Apollo Guidance Computer around integrated circuits was critical to the NASA computer's success. The program bought so many circuits that NASA's Apollo Program was "the largest single consumer of integrated circuits between 1961 and 1965." The nation's space agency was not the only customer of these innovative chips. One of the major new features of the U.S. military's Minuteman II nuclear missile was an improved guidance system incorporating semiconductor integrated circuits and miniaturized discrete electronic parts. The Navy also took part in the revolution's beginning: its Northrop Grumman E-2 Hawkeye used integrated circuits in radar and radio communications to upgrade the aircraft's performance. By the end of the decade, a new generation of chips was born.
In the late 1960s, devices containing hundreds of transistors on each chip were called "medium-scale integration." Even though medium-scale integrated circuits cost a little more to produce, they were still economically attractive because they allowed more complex systems to be built on smaller circuit boards. As faster and more affordable computers became available, professors and students at Dartmouth College created a programming language called BASIC in 1964, which was usable by those lacking a mathematical background and relatively easy to learn. After another decade passed, more advances occurred.
In the mid-1970s, integrated circuits entered another generation, "large-scale integration." Computer mice, bar-code scanners, ATMs, and ARPANET, the precursor to the modern Internet, were built using this generation of circuits. Although the 1K-bit RAMs, calculator chips, and first microprocessors manufactured in moderate quantities in the early 1970s count as large-scale integration, true large-scale integrated circuits began to be produced around 1974. These were the first modern computer chips; they had on-chip memory and were second-generation microprocessors. The final generation, beginning in the 1980s and continuing through the present, is "very large-scale integration," or VLSI. These chips are home to microprocessors, or "cores," which "control everything from computers and cellular phones to digital microwave ovens." Though countless more devices use integrated circuits, it is obvious that removing modern computers, mobile phones, and microwave ovens from Americans' daily lives would be drastic. There would be no World Wide Web, phones would be limited to talking only, and reheating leftovers from last night's dinner might as well be impossible.
Today's image of America would not exist without integrated circuits, and integrated circuits continue to shape tomorrow's image of America. Even though their impact is already great, integrated circuits keep shrinking in size and falling in price while retaining the same power. As Gordon E. Moore put it: "If the auto industry advanced as rapidly as the semiconductor industry, a Rolls Royce would get a half a million miles per gallon, and it would be cheaper to throw it away than to park it."