A Brief and Fun History of Coding: Silicon Valley
Following that crazy time at the height of the Cold War known as the Space Race, many of the engineers and tinkerers working in the infancy of computer development began to recognize the value of computing technology and what it meant for the world as a whole.
These advancements had lifted man into the upper reaches of our atmosphere and even landed him on the moon—what else could they do?
But even before these events, the groundwork was being laid in another part of the country, setting the stage for unprecedented growth in the field of technology. For reasons that are somewhat disputed (possibly because we humans just like to argue), a great many of those who recognized this potential gathered in a region of Northern California known as the Santa Clara Valley.
At one point responsible for producing 30% of the world’s plums, this bountiful farmland sat next door to Stanford University, which came into existence largely because its namesake made a fortune in railroads carrying people and goods to and from those very crops. Two alumni of the prestigious research institution went on to found the original “garage startup,” one that still directly shapes how the computing world operates.
William Hewlett and David Packard started their business in 1939, building electrical test equipment out of a one-car garage in Palo Alto at the urging of Frederick Terman, a Stanford electrical engineer who not only helped turn the university’s engineering program into one of the best in the country, but also worked hard to convince its graduates to stay in the area and start businesses of their own.
Both during and after the war, the Department of Defense poured money into the technological development groups within Stanford’s sphere of influence. Relevant to our concerns here: according to some sources, Stanford’s freshman class grew by more than 1,000 students in 1948, largely thanks to the newly created GI Bill, which returning war veterans could use to go to college.
This produced a substantial increase in the supply of engineering specialists, and innovators in the field took notice. After some infighting that would take a soap opera to explain (or leave you even more confused, whichever), Nobel Prize winner William Shockley moved west, opened a semiconductor business in Mountain View, California, and hired the best and brightest Stanford had to offer.
These individuals were so good, in fact, that in a funny twist of fate they outsmarted their boss, recognizing that the material he insisted on using for semiconductors was not as heat-resistant as the alternatives. When Shockley refused to change course, the “traitorous eight” (as he reportedly called them) left to form Fairchild Semiconductor in 1957, where they went on to produce the first integrated circuit built on silicon, arguably the most important invention of the computer age.
To put the timing in perspective, and how it would shape the advancement of computing technology: 1957 also saw the first human-made satellite orbit the Earth, the Soviet Union’s Sputnik 1. Nowhere was the need for smaller, lighter, and more efficient computing technology considered more urgent than in the (pun intended) launch of the Space Race, and the US government took note of what was happening in California.
With the Cold War becoming a greater and greater concern, many in the US saw the benefits of technology well beyond its obvious uses in the Space Race. It is a rather short line to draw, for example, from the Defense Department’s recognition that it needed advanced technologies to the creation of DARPA (the Defense Advanced Research Projects Agency, originally just ARPA) and subsequently ARPANET, the precursor to the Internet.
Private industry recognized the need for integrated circuits that could pack more data into smaller spaces, and Cold War concerns compelled Defense Department agencies to subsidize those advancements (and demand more of them). That combination led to an explosion of growth in the area we now know as Silicon Valley, with numerous top companies setting up shop there.
Two of Fairchild Semiconductor’s founders left that organization in 1968 to start a company named Intel, one you may have heard of, given that it produced the first commercially available microprocessor and became one of the most successful and influential companies in the world. AT&T, Texas Instruments, and Energizer all debuted heavily influential products there. And Apple, whose innovations continue to reshape our world, seems to have arisen largely from the desire to do something different from what everyone else in the area was doing.
But there is a key moral to this story that goes beyond a simple history of how Silicon Valley shaped the computing world. While it is interesting to know how we got here, history is far more useful for its lessons than for the mere novelty of the past, even in the world of coding, and from this particular lesson we can draw both a cautionary note and an encouraging one.
The lesson here is that Silicon Valley is something of a Black Swan, and from that fact, in true Nassim Taleb fashion, flow both warnings and inspiring admonitions. The confluence of events that led to Silicon Valley’s innovations cannot be duplicated. World War II, the rise of the Soviet Union, the Space Race, and the invention of the integrated circuit were each singular occurrences in their own right, and the fact that they all took place within the lifespan of a single generation, along with so many other factors, makes this a set of circumstances impossible to recreate. Other Black Swans will surely come along, but be leery of those who race to duplicate the unpredictable.
The inspiration, however, lies in recognizing that creativity often comes from not knowing any better, from having no rules constraining the creator simply because those rules have not yet been written. Those who developed the first microprocessor did not do so because someone handed them a requirement; they needed something that worked more efficiently and were given the freedom to solve the problem however they saw fit.
Software engineering and development is, broadly speaking, about creating a solution to a problem that has never been solved before, using the structures of programming languages without being totally constrained by how they have been used in the past. Like writing a book, developing software may use the same language as countless works already published, but in a combination never before seen.
The history of Silicon Valley should show us that the goal is not to duplicate, but rather to learn the lessons of what really smart people have done and make it a little bit better. Come to Code Platoon to figure out how you can start doing exactly that in the world of computing and software development.
Greg Drobny is a former Airborne Infantryman, PSYOP Team Chief, political consultant, and professional mil blogger, and is Code Platoon’s Student Outreach Coordinator. He holds a BA in history and a Master of Science in organizational psychology, and is currently pursuing an MA in history. He is married with four children who keep him more than slightly busy, and he is passionate about helping veterans find their paths in life and develop the skills needed to pursue their goals.