HOW AMERICA LAUNCHED THE DIGITAL AGE

1925–1945
Exterior view of Bell Telephone Laboratories building with trees and walkway. (Library of Congress/Gottscho-Schleisner)

BELL LABS IS BORN

“How do you manage genius?
You don’t.” 

—  Mervin Kelly,
president of Bell Labs (1951–1959)

Portrait of Mervin J. Kelly in a suit, photographed for Bell Telephone Magazine. (Bell Telephone Magazine)

Bell Labs, a legendary research hub in New Jersey, began as a branch of the Western Electric Company, a subsidiary of the American Telephone and Telegraph Company (AT&T).

Founded in 1925 to meet a growing need for mass communications, Bell Labs hired top engineers, physicists, chemists, and mathematicians to design and patent equipment (including a high-vacuum tube that transmitted telephone signals across North America).

Three engineers adjust large television transmission apparatus inside Bell Labs lab, 1920s (© Underwood Archives/Getty Images)

Bell Labs engineers test early television transmission equipment. (© Underwood Archives/Getty Images)

Bell Labs encouraged interdisciplinary collaboration that produced groundbreaking discoveries. The labs were driven by scientific curiosity, flexible deadlines, and — thanks to AT&T's budget — stable funding. Lab directors adopted a hands-off management style, and innovation flourished.

Karl Jansky sits beside his large rotating radio antenna used to detect cosmic radio waves, 1930s. (© Bettmann/Getty Images)
DID YOU KNOW?

In 1932, Bell Labs physicist Karl Jansky discovered radio waves coming from outer space. He’s known as the father of radio astronomy.

THE LATE 1940s

Close-up of the first transistor displayed on a small pedestal at Bell Labs. (© James Leynse/Corbis/Getty Images)

A REVOLUTIONARY TRANSISTOR

“The legacy of Bell Labs is in every electronic device we use today.” 

—  Jon Gertner, author of
The Idea Factory: Bell Labs and the Great Age of American Innovation

In the post-World War II period, Bell Labs’ Mervin Kelly assembled an all-star team of scientists to develop a replacement for the vacuum tube, which was bulky, fragile, and prone to burning out.

In 1947, John Bardeen and Walter Brattain — supervised by fellow physicist William Shockley — invented the point-contact transistor, a semiconductor device that amplifies electrical signals and switches currents on and off.

Three scientists work together in a lab surrounded by instruments, 1950s (© Hulton Archive/Getty Images)

Bell Labs physicists John Bardeen, William Shockley, and Walter Brattain conducted experiments that redefined electronics. (© Hulton Archive/Getty Images)

In 1948, Shockley designed the junction transistor, a more robust and reliable device. Its small size, low power consumption, and durability paved the way for computers, portable radios, cell phones, and other devices.

Eight years later, Bardeen, Brattain, and Shockley would be awarded the Nobel Prize in physics for this breakthrough.

William Shockley receives Nobel Prize medal from King Gustav VI Adolph in Stockholm, 1956. (© AFP/Getty Images)
DID YOU KNOW?

Bell Labs researchers have been awarded 10 Nobel Prizes in physics and chemistry, spanning from 1937 to 2023. While Bell Labs was at its most productive from the 1940s to the 1970s, important research continues today at its New Jersey headquarters.

1950s–1970s
The Shockley Semiconductor Laboratory, Beckman Instruments Inc. building in Mountain View, California (Courtesy of Computer History Museum)

SILICON VALLEY AND MICROCHIPS

“The expertise and the spirit of Bell Labs spread outward, and in that sense, you can say the baton really did pass to Silicon Valley.”

—  Jon Gertner, author of
The Idea Factory: Bell Labs and the Great Age of American Innovation

Bell Labs continued to improve transistor technology during the 1950s, developing the silicon transistor and the metal-oxide-semiconductor field-effect transistor (MOSFET).

The MOSFET proved crucial for building high-density integrated circuits (ICs), or microchips, in the 1960s. Today's microchips can contain billions of tiny transistors, crafted from semiconductor materials (commonly silicon), that work together to power electronics.

Recognizing the potential for widespread impact and profits, Bell Labs created licensing agreements to share transistor technology with other companies.

Technicians examine spherical Telstar communications satellite inside Bell Labs lab, early 1960s (© Keystone-France/Getty Images)

Engineers at Bell Labs test the Telstar satellite, a breakthrough in global communications. (© Keystone-France/Getty Images)

In 1955, William Shockley left Bell Labs to establish Shockley Semiconductor Laboratory in Mountain View, California. Within a couple of years, some of his employees — engineers and scientists — formed their own company, Fairchild Semiconductor.

Fairchild is often credited as the birthplace of Silicon Valley. The company became a major player in the growing semiconductor industry, and many Silicon Valley firms — including Intel (founded in 1968) and Apple (founded in 1976) — trace their roots to Fairchild alumni.

Steve Jobs standing in front of an "Apple Computer" sign (© Tom Munnecke/Getty Images)

Steve Jobs, co-founder of Apple, attends the first West Coast Computer Faire, where the Apple II computer debuted, in San Francisco in 1977. (© Tom Munnecke/Getty Images)

Close-up of a small integrated-circuit chip with gold connectors, 1981 (© David Madison/Getty Images)
DID YOU KNOW?

The invention of the integrated circuit solidified Silicon Valley’s industry dominance. And in 1971, Intel released the first commercially available microprocessor, able to perform calculations and control devices on a single chip.

1980s–1990s
Close-up of Fujitsu 64-kilobit DRAM memory chips labeled 64K SC D-RAM, photographed in Tokyo in 1984. (© Bettmann/Getty Images)

SEMICONDUCTORS GO GLOBAL

“Every country in the world wants their own version of [semiconductor manufacturing], because it gives you a supply chain and economic and national security.”

—  Thomas Caulfield,
executive chairman of GlobalFoundries

As demand for semiconductors grew, so did the need for manufacturing capabilities.

Throughout the 1980s and 1990s, Japan, South Korea, and Taiwan became players in the industry, with Japanese companies like Toshiba and NEC influencing the data-storage market and South Korea’s Samsung and SK Hynix focusing on memory-chip production.

Modern Toshiba corporate building in Tokyo with company logo, circa 1995 (© Kaku Kurita/Gamma-Rapho/Getty Images)

Toshiba headquarters symbolize Japan’s influence on the electronics industry through the 1990s. (© Kaku Kurita/Gamma-Rapho/Getty Images)

Meanwhile, the Taiwan Semiconductor Manufacturing Company (TSMC) upended the traditional business model of integrating chip design and manufacturing under one roof. It introduced the fabless-foundry model, in which firms specialize in either design (fabless) or fabrication (foundry).

This increased efficiency. What’s more, it allowed many small firms — those lacking resources to open manufacturing plants — to design chips.

Engineers push trolleys carrying wafer pods inside semiconductor fabrication plant in Taiwan, 2006. (© Sam Yeh/AFP/Getty Images)
DID YOU KNOW?

The fabless-foundry business model democratized chip production, allowing startups to enter the market without the need for expensive manufacturing facilities.

1990s–2000s
Gloved technician assembling a green circuit board (© Brownie Harris/Getty Images)

THE OFFSHORING ERA

“Three decades ago, the United States led the world in semiconductor manufacturing. The U.S. has a great base of science and technology talent. Now we need to rebuild our manufacturing base … to strengthen our leadership.”

—  Walter Copan,
former under secretary of commerce for standards and technology (2017–2021)

The emergence of the internet, personal computing, and mobile communications drove unprecedented demand for semiconductors.

This demand fueled the expansion of offshore manufacturing — U.S. companies moved production to East Asia, notably Taiwan, to reduce costs. (By 2022, the U.S. share of global chip manufacturing had dropped to 10 percent, down from 37 percent in 1990.)

Person in mask inspecting silicon (© Sam Yeh/AFP/Getty Images)

A UMC technician inspects a 12-inch wafer in Tainan, Taiwan. (© Sam Yeh/AFP/Getty Images)

As internet use increased worldwide and the Y2K computer glitch required attention, U.S. companies offshored software services, particularly to India.

The increasing complexity of microchips and rising fabrication costs led more companies to adopt the fabless (design-only) business model, further expanding offshore operations.

These trends transformed the semiconductor industry into a globalized landscape with complex supply chains. By the 2000s, Asia had become the center of semiconductor manufacturing.

Close-up of an Intel 300 mm silicon wafer showing colorful microchip patterns, photographed in Tokyo, 2007 (© Yoshikazu Tsuno/AFP/Getty Images)
DID YOU KNOW?

As the fabless-foundry model and supply chain expanded, the intellectual property (IP) licensing model also gained traction, allowing companies to license pre-designed functional blocks (like processor cores or memory) for their chips, reducing development time and costs.

2020s
Wide view of TSMC facility in Phoenix, Arizona, 2023 (© Caitlin O’Hara/The Washington Post/Getty Images)

THE U.S. CHIP RENAISSANCE

“Strategic investment in [chip] manufacturing will create 90,000 American jobs and advance the American semiconductor industry.”

—  Sanjay Mehrotra, president and chief executive of Micron Technology

President Trump has long prioritized the return of chip manufacturing.

Boosted by government incentives and private investment, chip manufacturing is projected to significantly bolster the U.S. semiconductor industry by increasing domestic production, creating tens of thousands of jobs, and strengthening national security and economic competitiveness.

Intel chief executive Brian Krzanich stands beside President Trump in the Oval Office while holding a silicon wafer (© Chris Kleponis/Getty Images)

Intel chief executive Brian Krzanich meets with President Trump at the White House in 2017 to announce a $7 billion investment in a new Arizona factory — one of several commitments to U.S. chip manufacturing. (© Chris Kleponis/Getty Images)

A stronger domestic semiconductor industry will ensure a reliable supply of critical chips for defense and other essential sectors, all while fostering innovation.

Industry experts say this shift will help the U.S. lead in strategic technologies such as artificial intelligence (AI), 5G and 6G communications, and quantum computing (harnessing quantum mechanics to solve problems that even the most powerful classical computers cannot).

Micron Technology logo displayed on modern building exterior in San Jose, 2025. (© Justin Sullivan/Getty Images)
DID YOU KNOW?

With companies like Intel, Samsung, and Micron Technology now investing billions of dollars in new chip manufacturing facilities across the U.S., the total private-sector investment is estimated at more than $500 billion. The U.S. is expected to more than triple its semiconductor manufacturing capacity between 2022 and 2032, the highest growth rate in the world.

LOOKING AHEAD
Man examines components of next-generation quantum computer (© Angela Weiss/AFP/Getty Images)

THE NEXT ERA OF INNOVATION

“Quantum computing is a completely new era of computing. You could almost think of it as a new type of transistor.”

— Walker Steere, engineer at IonQ

Experts predict that quantum computing — with its ability to accelerate AI by overcoming limitations on data size, complexity, and processing speeds — will shape the future.

Quantum AI could yield algorithms that advance pharmaceutical discoveries, predict financial outcomes, improve manufacturing, and bolster cybersecurity. Quantum-AI partnerships already form an active and growing market, with U.S. tech giants like IBM and Nvidia investing in both domains.

View of a quantum computing lab with equipment and cables (© Angela Weiss/AFP/Getty Images)

IBM’s quantum computing lab showcases the experimental future of American innovation. (© Angela Weiss/AFP/Getty Images)

U.S. firms GlobalFoundries and IonQ are also pursuing these frontiers.

Through strategic partnerships, GlobalFoundries positions itself as a leading chip manufacturer for the quantum industry.

IonQ invests in advanced trapped-ion quantum computers, while developing hybrid models that combine quantum and classical computing to enhance AI capabilities.

Walker Steere anticipates “massive breakthroughs” in chemical and pharmaceutical research and engineering. Technology is constantly evolving, he says. “I’d much rather contribute to developing that technology than watching from the sidelines as others do.”

Close up of Google’s quantum processor (© Google)
DID YOU KNOW?

The U.S. develops powerful quantum chips like Google's Willow chip, which dramatically reduces errors and processes some complex calculations far faster than supercomputers can. In medicine, quantum AI can simulate molecular interactions to accelerate the design of new drugs and improve diagnostics.

Afterword:
America's Approach to Innovation

Industry leaders point to many factors that shape U.S. technological innovation. One such factor is the U.S. system of intellectual property protection, which fosters the spirit of risk-taking, says Walter Copan. (That system is enshrined in the U.S. Constitution, thanks to the foresight of America’s Founding Fathers.)

Sanjay Mehrotra cites the U.S. business culture of “openly, freely being able to debate ideas,” adding, “The best ideas win.”

Thomas Caulfield says, “This is where you can work hard, live your dream, become an entrepreneur, start a company.”

And Jon Gertner notes that key people at Bell Labs came from humble beginnings: “To me, that feels uniquely American — the idea that talent could rise from almost anywhere and shape the future of communications.”


Suburban house and garage in Los Altos where Apple was founded, 2011 photo (© Kevork Djansezian/Getty Images)
DID YOU KNOW?

It’s part of Silicon Valley lore that massive tech empires often sprouted from humble roots. As quantum computing and AI herald the next seismic shifts in technology, innovation hubs could emerge in unlikely places. Who knows? The next great U.S. tech companies might now be incubating in a town anywhere in America.

Additional Photo Credits:
(Library of Congress/Gottscho-Schleisner), (Bell Telephone Magazine), (© James Leynse/Corbis/Getty Images), (Computer History Museum/Beckman Foundation), (© Bettmann/Getty Images), (© Roslan Rahman/AFP/Getty Images), (© Brownie Harris/Getty Images), (Courtesy of Walter Copan), (© Caitlin O’Hara/The Washington Post/Getty Images), (© Mandel Ngan/AFP/Getty Images), (© Angela Weiss/AFP/Getty Images), (Courtesy of Walker Steere)

Writer: Lauren Monsen
Photo editor: Serkan Gurbuz
Graphic designer: Buck Insley
Video project manager: Afua Riverson
Video producer:
William Leitzinger
Production editor: Kathleen Hendrix
Digital storyteller: Pierce McManus

January 2026

This site is managed by the Bureau of Global Public Affairs within the U.S. Department of State. External links to other internet sites should not be construed as an endorsement of the views or privacy policies contained therein.