In 1947, researchers at Bell Labs in New Jersey invented the transistor. It amplified signals and switched circuits without the heat and fragility of vacuum tubes, and engineers quickly grasped that electronics would be reorganized around it. In 1956, its co-inventor William Shockley left Bell Labs to found a semiconductor firm in Mountain View, California, near his hometown of Palo Alto.
Shockley proved difficult to work for. In 1957, eight of his engineers — Gordon Moore, Robert Noyce, Eugene Kleiner, and five others — left to form Fairchild Semiconductor. Fairchild scaled rapidly and then replicated the pattern: Moore and Noyce departed in 1968 to start Intel; Kleiner became a venture capitalist; dozens of alumni founded new firms. The Valley’s semiconductor cluster did not descend from a single company so much as from repeated defections that circulated expertise, capital, and ambition.
The region was not a blank slate. Stanford in the 1950s pushed faculty into commercial ventures and leased campus land to technology firms. At the same time, the Pentagon steered billions in defense contracts to electronics companies near the university, concentrating demand and accelerating the formation of a skilled workforce. Shockley chose a place already assembling the prerequisites for scale — but by bringing transistor manufacturing west, he fused breakthrough technology to an emerging ecosystem.
In 1965, Moore observed that the number of components on a chip was doubling roughly every year while per-component costs fell; he later revised the pace to every two years. The statement functioned less as prophecy than as a coordination device: chipmakers, equipment suppliers, and software firms aligned their roadmaps to it, turning expectation into practice. Each generation of chips made new software economically viable, which expanded markets for the next generation of hardware. The feedback loop persisted for decades, surviving downturns and technical bottlenecks.
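The force of that observation as a planning assumption comes from simple compounding. The sketch below is illustrative only (the function name and the 35-year horizon are choices made here, not details from the essay); it computes the growth factor implied by a fixed doubling period.

```python
# Illustrative sketch: compounding implied by a fixed doubling period.
# The two-year period is Moore's later revision; his 1965 paper projected
# a doubling roughly every year.

def growth_factor(years: float, doubling_period_years: float = 2.0) -> float:
    """Return the multiplicative growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    # From the 1965 observation to the year 2000 is 35 years:
    print(f"{growth_factor(35):,.0f}x")  # roughly 185,000x
```

Even at the slower two-year pace, thirty-five years of doubling multiplies transistor counts by a factor on the order of a hundred thousand, which is why firms that planned around the curve kept finding new markets waiting for them.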
By the mid-1970s, engineers in the Valley were building computers small enough for a desk. In 1977, Steve Jobs and Steve Wozniak began selling the Apple II, proving that small businesses and households would buy personal machines. In 1981, IBM entered with a personal computer assembled from outside components and a licensed operating system, and it published the specifications. Competitors cloned the design. Prices declined, volumes rose, and demand for chips, storage, and peripherals flowed back through the region’s suppliers.
The internet began as a government research network in the late 1960s and opened to commercial use in the early 1990s. Cisco, founded in 1984 by two Stanford employees who had built a router to connect incompatible campus systems, sold the routing equipment that made the shared network expand. Once connectivity was widespread, discovery became the constraint. In 1998, Stanford graduate students Larry Page and Sergey Brin founded Google around a method that ranked pages by the quality of their links. The same system priced attention through targeted advertising, generating profits that financed further expansion across the Valley’s firms.
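The ranking idea itself is compact enough to sketch. The toy link graph, damping factor, and iteration count below are assumptions made for illustration, not details from the essay; the loop simply lets each page pass a share of its score along its outgoing links until the scores settle, so pages linked to by well-linked pages rise to the top.

```python
# A minimal sketch of link-based ranking via power iteration.
# The graph and parameters are made up for illustration.

links = {
    "a": ["b", "c"],   # page "a" links to "b" and "c"
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],        # "d" links out but nothing links to "d"
}

def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}              # start with uniform scores
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share           # pass score along each link
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Running it ranks "c" highest, since three pages point to it, and "d" lowest, since nothing does: quality of inbound links, not volume of outbound ones, determines the score.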
By 2000, within roughly fifty miles, the semiconductor, the personal computer, and the commercial internet had each been developed, scaled, and brought to mass market. Three technological revolutions compounded one another in a single geography, reinforced by talent mobility, institutional support, and markets large enough to sustain repeated reinvention.