The Long and Short of It: How Frames of Reference Impact What We See and How We See It

Paul Asel
7 min read · Jul 12, 2022


Yesterday, NASA released the first full-color image from the James Webb Space Telescope (JWST). I began writing this article last December, when JWST launched as the successor to the Hubble Space Telescope, and I have eagerly awaited the telescope's first images for the past eight months. The images below show our universe in dynamic transition, with a new star forming in the Carina Nebula and a cloud of gas surrounding a dying star in the Southern Ring Nebula. The range of view that JWST offers on our universe is mind-boggling: the Carina Nebula is 7,600 light years away, or 44.7 quadrillion miles, and the Southern Ring Nebula is 2,000 light years away, or 11.8 quadrillion miles.
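For readers who want to check that arithmetic, here is a minimal Python sketch; the only input beyond the distances above is the conversion, cited later in this article, of one light year to about 5.88 trillion miles:

```python
# Rough check of the nebula distances quoted above.
MILES_PER_LIGHT_YEAR = 5.88e12  # 1 light year ≈ 5.88 trillion miles

for name, light_years in [("Carina Nebula", 7_600), ("Southern Ring Nebula", 2_000)]:
    miles = light_years * MILES_PER_LIGHT_YEAR
    print(f"{name}: {light_years:,} light years ≈ {miles / 1e15:.1f} quadrillion miles")
```

The script prints 44.7 and 11.8 quadrillion miles, matching the figures above.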

JWST reminds us that our perspective defines us. It seems timely that we reconsider our frame of reference and how we see the world and our place in the universe.

Carina Nebula — star-forming region NGC 3324

Southern Ring Nebula — dying star

In 1726, Jonathan Swift published Gulliver’s Travels, which has delighted readers ever since with tales of the imaginary islands Gulliver visits on his sailing adventures. His depictions of Lilliput and Brobdingnag still resonate with me decades after I first read the book. Gulliver washes ashore after a shipwreck on Lilliput and awakens to find himself pinned to the ground with stakes and string by Lilliputians the size of grasshoppers. He eventually secures his release and later lands on Brobdingnag, where he walks antlike among grass as tall as trees and encounters farmers who are 72 feet tall.

Swift reminds us that our perspective defines us, both in how we see the world and in how others see us. Gulliver finds the Lilliputians preoccupied with trivialities. The Lilliputians, threatened by his size, plan to blind him before he escapes. On Brobdingnag, Gulliver must fend off giant wasps and is carried around in a playhouse that the queen calls her traveling kit.

Our potential range of view, both in how we see ourselves and the world, has never been greater than it is today. While Swift offered a scale of roughly 144:1 between Brobdingnagians and Lilliputians (72-foot farmers to six-inch Lilliputians), science today offers us an incomprehensibly expanded view of our universe. Our breadth of vision is increasingly matched by an enhanced depth of perspective.

At Brobdingnagian scale, geologic time and astronomical distance offer an expansive view of earth and our universe. Geologists estimate the age of the earth at 4.6 billion years. Astronomers measure the solar system in astronomical units (1 au is about 93 million miles, the distance from earth to the sun) and the universe in light years (1 light year equals 5.88 trillion miles). The diameter of the observable universe is estimated at 93 billion light years.

While geologic time and astronomical distance expand our perspective, they shrink our place in the universe. Human history, which began with the emergence of Homo sapiens around 200,000 B.C., would be but four seconds of a geologic day if the history of earth were compressed into twenty-four hours. Considering that humans inhabit just 10% of the earth's surface, our place in the universe appears both fragile and short-lived.
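The "four seconds" figure follows directly from the numbers above; a minimal sketch of the calculation:

```python
# Compress earth's 4.6-billion-year history into a 24-hour day and ask
# how much of that day belongs to Homo sapiens.
EARTH_AGE_YEARS = 4.6e9        # geologists' estimate of earth's age
HUMAN_HISTORY_YEARS = 200_000  # emergence of Homo sapiens
SECONDS_PER_DAY = 24 * 60 * 60

human_seconds = HUMAN_HISTORY_YEARS / EARTH_AGE_YEARS * SECONDS_PER_DAY
print(f"Human history on the geologic clock: {human_seconds:.1f} seconds")  # ≈ 3.8
```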

In 1734, just eight years after Swift published Gulliver’s Travels, Emanuel Swedenborg speculated in The Principia that there may be galaxies beyond our own, a conjecture that remained unproven for two centuries. Yet over the past 50 years, technology has affirmed Swedenborg’s speculation, converting science fiction into science.

NASA launched two Voyager space probes in 1977 that are still active and have moved beyond our solar system. In 2013, Voyager 1 reached interstellar space traveling 37,600 miles per hour, and by November 2020 it was 14.5 billion miles from earth. At its current velocity of about 330 million miles per year, Voyager 1 would still need some 40,000 years to pass within 1.6 light years of the star Gliese 445. With data from the Hubble Space Telescope, launched in 1990, scientists now believe there may be two trillion galaxies in the observable universe.
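A short Python sketch converts Voyager 1's speed into the units used here (assuming 365.25 days per year) and shows why interstellar distances dwarf even our fastest spacecraft:

```python
# Convert Voyager 1's speed from miles per hour to the units used above.
MPH = 37_600
HOURS_PER_YEAR = 24 * 365.25
MILES_PER_LIGHT_YEAR = 5.88e12

miles_per_year = MPH * HOURS_PER_YEAR
print(f"{miles_per_year / 1e6:.0f} million miles per year")                  # ≈ 330
print(f"{MILES_PER_LIGHT_YEAR / miles_per_year:,.0f} years per light year")  # ≈ 17,800
```

At roughly 17,800 years per light year, a 40,000-year journey toward Gliese 445 is easy to believe.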

Figure 1: Expanded view of our universe

Earth’s location in the Milky Way galaxy

Intergalactic view from Hubble Space Telescope

At Lilliputian scale, Gordon Moore observed in 1965 that transistor density doubles every two years. Moore’s Law has persisted, effectively increasing our microscopic visibility 50 million times in the past half century. State-of-the-art technology now operates in nanoseconds and nanometers. A nanosecond — one billionth of a second — is the cycle time of a one-gigahertz electromagnetic wave, the interpacket gap on a 100-gigabit Ethernet network, and the time it takes light to travel one foot. Nanometers — billionths of a meter — are the relevant scale for semiconductors, organic chemistry, molecular biology and surface science. The width of a human hair is about 60,000 nanometers.
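Moore's observation compounds like interest: if density doubles every two years, growth over t years is 2^(t/2). A minimal sketch of the math:

```python
# Moore's Law as a compounding formula: transistor density doubles
# every two years, so cumulative growth over `years` is 2 ** (years / 2).
def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(f"{moore_factor(50):,.0f}x")  # 33,554,432x after 50 years
```

Fifty years of biennial doubling compounds to roughly 34 million, the order of magnitude behind the "50 million times" figure above.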

Figure 2: Semiconductor technology advances over the past fifty years

The benefits of microscopic compute power are evident everywhere. The first digital computer, the ENIAC, occupied about half a basketball court yet had the compute power of 2,400 humans. By 2015, the Apple Watch had the processing power of two Cray-2 supercomputers, the world’s most powerful computer in 1985.

“Moore’s Law” is not a law. Gordon Moore acknowledged that transistor sizes would reach a boundary condition at the atomic level, which Intel expects to hit at 1.4 nanometers in 2029. Approaching the limits of “Moore’s Law” is both an opportunity and a threat. As Thomas Kuhn noted in The Structure of Scientific Revolutions, “The significance of crises is the indication they provide that an occasion for retooling has arrived.” In 2015, the International Technology Roadmap for Semiconductors prepared a fifteen-year roadmap that, for the first time, included research beyond Moore’s Law. Research on DNA computing and quantum computing is afoot to breach the atomic barrier.

Science is sometimes an uncomfortable truth, especially when scientific revolutions challenge cherished beliefs or livelihoods. Copernicus and Galileo incited the Roman Inquisition with their heliocentric theories in the 16th and 17th centuries. Christianity deemed Darwin’s theory of evolution heretical in the 19th century. Einstein struggled to reconcile his faith with his theory of relativity in the 20th century. The Enlightenment shook the foundations of religious faith yet offered a passage beyond the Middle Ages. The Industrial Revolution crowned Britain a global power and gave Europe control over 85% of the world’s landmass, but it repressed the working class, shaking faith in government and unleashing the scourge of Marxism. Nuclear physics extended our reach to subatomic levels, yet scientists on the Manhattan Project lamented the use of the atomic bomb they helped create.

As it extends beyond the realm of human comprehension, science is subject to flights of fancy and spirals of skepticism. Emerging technologies have the potential to redefine our work and play, yet they raise questions about their role in society. Bill Gates and Jeff Bezos have called artificial intelligence promising yet more dangerous and scary than nuclear energy. Calling artificial intelligence “our biggest existential threat,” Elon Musk co-founded OpenAI to pursue beneficial applications of AI.

Yet science and technology are amoral and apolitical. Science is not a question of right or wrong but of how we use it. Technology is troubling only because its prospects extend beyond the capacity of humans and governments to control how it is used.

As Roy Amara observed, we tend to overestimate the impact of technology in the short term and underestimate its impact in the long term. Leading physicists in the early 20th century believed physics had reached its apogee just before Einstein emerged with his theory of relativity, the dawn of subatomic physics and the creation of the atomic bomb. Ken Olsen, founder of Digital Equipment Corporation, dismissed the prospect of home computers in 1977, yet the microprocessor soon put a computer on every desktop and, now, on many of our wrists. As Figure 3 suggests, Middle Age savants could not have anticipated the Industrial Revolution, nor industrialists the information age.

Figure 3: Expanding purview of science from the Middle Ages to the Information Age

The question is not whether we pursue scientific progress but how we use it. The atomic bomb shook a generation and responsible governments learned to limit its use even in the darkest hours of the Cold War. Artificial intelligence, quantum computing and genomics have the potential to alter the course of humankind. Whether this is good or bad depends on our frame of reference. Governments and scientists must mitigate risks while harnessing the opportunities they present.

