Superconductors
Discovered in 1911 in liquid-helium-cooled mercury, superconductors became useful only when later materials could carry large currents in strong fields.
Zero resistance looked like a laboratory trick when Heike Kamerlingh Onnes cooled mercury in Leiden in 1911. It was actually a new physical regime. Current could flow without the steady tax that ordinary conductors always charge.
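A minimal worked illustration of that tax, using round numbers chosen for clarity rather than measured values: an ordinary conductor dissipates power as Joule heating, P = I²R. A winding carrying 1,000 A through a residual resistance of 0.01 Ω therefore burns (1,000)² × 0.01 Ω = 10 kW continuously just to hold its current, while the same current circulating in a superconducting winding with R = 0 dissipates nothing; the operating cost moves from the conductor to the cryogenics.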
That discovery depended on a habitat humans had only just built. Helium isolation and liquefaction let Onnes reach temperatures that nature almost never supplies on Earth. In that sense superconductivity began as niche construction: change the environment enough, and matter starts behaving by different rules.
The gap between discovery and use was long. Early superconductors worked only at punishingly low temperatures and abruptly reverted to normal conduction in even modest magnetic fields, so the phenomenon remained more scientific curiosity than engineering platform. Path dependence set in all the same: once physicists knew the effect was real, they kept pushing toward lower temperatures, purer materials, and higher critical fields until the materials problem gave way.
That turning point made the superconductor an enabling substrate rather than a result. Niobium-based alloys and compounds in the postwar era finally carried enough current under enough field to support the superconducting magnet. From there the lineage ran outward into accelerators, MRI systems, and other cryogenic devices that copper could not serve efficiently. The world had to build cold before it could build useful field.
What Had To Exist First
Preceding Inventions
Required Knowledge
- Low-temperature measurement
- Electrical resistance testing
- Helium liquefaction
Enabling Materials
- Liquid helium
- Purified mercury samples
- Cryogenic glass and vacuum apparatus
What This Enabled
Inventions that became possible because of Superconductors:
- Superconducting magnets
- Particle accelerators
- MRI systems
Biological Patterns
Mechanisms that explain how this invention emerged and spread:
- Niche construction: the enabling environment of liquid-helium cold had to be built before the phenomenon could appear.
- Path dependence: once the effect was known to be real, research kept pushing toward better materials and higher critical fields.