The Remarkable Story Behind The Most Important Algorithm Of All Time
Summary
TL;DR: The script explores the significance of the Fast Fourier Transform (FFT), a pivotal algorithm in signal processing used in technologies from WiFi to nuclear detection. It delves into the history of nuclear arms control efforts and how the FFT could have influenced them. The video also highlights the discovery and application of the FFT in various fields, emphasizing its importance in modern society and the potential impact of career choices on global issues.
Takeaways
- 🌟 The Fast Fourier Transform (FFT) is a crucial algorithm used in various applications including signal processing, radar, sonar, 5G, and WiFi.
- 🔍 FFT's significance extends to its role in potentially curbing the nuclear arms race, as it was instrumental in detecting covert nuclear weapons tests.
- 💡 The algorithm was discovered by scientists during efforts to detect nuclear tests, and its earlier discovery could have influenced the course of nuclear disarmament.
- 🕊️ Post-World War II, the U.S. proposed the Baruch Plan for international control of nuclear materials, but it was rejected by the Soviets, leading to the nuclear arms race.
- ⏳ The Partial Test Ban Treaty of 1963 banned nuclear tests in the atmosphere, underwater, and space, but not underground, due to difficulties in verifying compliance.
- 🌐 The challenge in detecting underground nuclear tests lies in distinguishing between seismic activity from natural earthquakes and man-made explosions.
- 📊 A Fourier transform decomposes a signal into its constituent frequencies, aiding in the analysis of seismometer signals to identify nuclear tests.
- 🔢 The Fast Fourier Transform algorithm reduces the computational complexity from N squared to N log N, making it feasible to process large datasets quickly.
- 📚 Carl Friedrich Gauss had discovered a form of the FFT in 1805, but it wasn't published in a widely accessible form, delaying its impact until the 20th century.
- 🛰️ The FFT is fundamental to modern signal processing, including image and audio compression, and has been called the most important numerical algorithm of our lifetime by mathematician Gilbert Strang.
- 🚀 The video also promotes the nonprofit 80,000 Hours, which helps individuals find careers that can make a significant positive impact on pressing global issues.
Q & A
What is the Fast Fourier Transform (FFT)?
-The Fast Fourier Transform (FFT) is an algorithm that efficiently computes the Discrete Fourier Transform (DFT) of a sequence and its inverse. It breaks down a signal into its constituent frequencies, simplifying the analysis and processing of various signals, and is widely used in signal processing, including audio, image, and data compression.
Why is the FFT considered so important?
-The FFT is considered important because it is foundational in many areas of technology and science, including signal processing, radar, sonar, 5G, WiFi, and even in the detection of nuclear tests. Its efficiency in handling large amounts of data has made it a critical tool for modern computing and communication systems.
What was the Baruch plan and why was it significant?
-The Baruch Plan was a proposal by the United States after World War II to decommission all their nuclear weapons if other nations pledged never to make them. It aimed to prevent the proliferation of nuclear weapons and was significant because it showed the U.S.'s willingness to engage in disarmament discussions, although it was ultimately rejected by the Soviets.
How did the nuclear arms race begin?
-The nuclear arms race began after the rejection of the Baruch Plan by the Soviet Union. They saw it as an attempt to maintain American dominance in nuclear weapons. This led to both superpowers developing and testing new nuclear weapons, escalating the arms race.
What was the purpose of the 1958 Geneva Conference on the Discontinuance of Nuclear Weapon Tests?
-The purpose of the 1958 Geneva Conference was to negotiate a comprehensive test ban to prevent any state from testing nuclear weapons. It was an attempt to reduce the proliferation of nuclear weapons and the associated risks.
Why was it difficult to detect underground nuclear tests?
-Underground nuclear tests are difficult to detect because their radiation is mostly contained within the earth, and the seismic activity produced is similar to natural earthquakes. Additionally, the Soviets refused onsite compliance visits, making it challenging to verify compliance with a test ban.
What role did the FFT play in the detection of covert nuclear tests?
-The FFT played a crucial role in analyzing seismometer signals to distinguish between natural earthquakes and underground nuclear tests. By decomposing the signals into their frequency components, scientists could identify the unique signatures of nuclear explosions.
Who discovered the Fast Fourier Transform and when was it rediscovered?
-The Fast Fourier Transform was originally discovered by Carl Friedrich Gauss in 1805, but it was not widely adopted because it was published posthumously in non-standard notation. It was rediscovered in the 20th century by James Cooley and John Tukey, whose 1965 publication of the algorithm brought it into widespread use.
How did the FFT contribute to the field of image compression?
-The FFT contributes to image compression by transforming images into the frequency domain, where most natural images have many high-frequency components with amplitudes close to zero. By discarding these high-frequency components and inverting the transform, a compressed version of the image can be obtained with minimal loss of quality.
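A rough sketch of that idea on a single row of pixel brightness values (the row values and the cutoff threshold here are invented for illustration, and a plain DFT stands in for the FFT for clarity):

```python
import cmath

# Frequency-domain compression in miniature: transform a row of pixels,
# discard the weak frequency components, and transform back.
def dft(x, sign=-1):
    n = len(x)
    return [sum(x[m] * cmath.exp(sign * 2j * cmath.pi * k * m / n)
                for m in range(n)) for k in range(n)]

def idft(X):
    return [v / len(X) for v in dft(X, sign=+1)]

row = [10, 12, 14, 12, 10, 8, 6, 8]  # a smooth run of brightness values
spectrum = dft(row)
# "Compression": keep only the strong coefficients, zero out the rest
kept = [v if abs(v) > 3.0 else 0 for v in spectrum]
restored = [round(v.real, 1) for v in idft(kept)]
print(sum(1 for v in kept if v != 0), "of", len(row), "coefficients kept")
print(restored)  # close to the original row despite the discarded data
```

Even in this toy case, three coefficients reconstruct the eight-value row to within a fraction of a brightness level, which is the essence of lossy frequency-domain compression.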
What is the significance of the 80,000 Hours organization mentioned in the script?
-80,000 Hours is a nonprofit organization that helps people find fulfilling careers that make a positive impact on the world. They provide resources such as research-backed guides, a job board, and a podcast to assist individuals in making career choices that align with their desire to do good.
Outlines
🌟 The Importance of FFT in Modern Technology
This paragraph introduces the Fast Fourier Transform (FFT) as a crucial algorithm with widespread applications in technology, including video streaming, radar, sonar, 5G, and WiFi. It also delves into the historical context of the FFT's discovery, linking it to the efforts of scientists trying to detect covert nuclear weapons tests. The narrative suggests that had the FFT been discovered earlier, it might have influenced the course of the nuclear arms race, potentially preventing it. The paragraph also touches on the aftermath of the atomic bombings of Hiroshima and Nagasaki, the subsequent nuclear arms race, and the attempts by the U.S. and other nations to control nuclear proliferation through international agreements like the Baruch Plan.
🌐 Challenges in Detecting Nuclear Tests and the Role of Seismometers
The second paragraph discusses the difficulties in detecting underground nuclear tests, as opposed to atmospheric or underwater tests, which are easier to identify due to the dispersal of radioactive isotopes or sound waves. It highlights the 1963 Partial Test Ban Treaty that prohibited testing in the atmosphere, underwater, and in space but not underground, due to the inability to verify compliance. The paragraph introduces the idea of using seismometers to detect the ground vibrations caused by nuclear explosions and the challenge of distinguishing these signals from natural seismic activity. It sets the stage for the necessity of a method to analyze these signals, leading to the importance of Fourier transforms in signal processing.
📊 Understanding Fourier Transforms and Their Application in Signal Analysis
Paragraph three provides an in-depth explanation of Fourier transforms, which are essential for breaking down complex signals into their constituent sine waves. It describes the process of determining the presence and amplitude of different frequencies in a signal, which is crucial for distinguishing between seismic events and nuclear tests. The paragraph also touches on the historical development of the Fourier transform and its application in various fields, including the analysis of seismometer signals. It introduces the concept of the Discrete Fourier Transform (DFT), which is necessary for processing real-world, finite signals, and discusses the limitations of DFT in terms of computational requirements, setting the stage for the introduction of the Fast Fourier Transform (FFT).
🚀 The Discovery and Impact of the Fast Fourier Transform (FFT)
The fourth paragraph narrates the discovery of the Fast Fourier Transform (FFT) by John Tukey, which significantly reduced the computational complexity of performing Fourier transforms. It details how the FFT's intended application, detecting underground nuclear tests during the Cold War, was initially kept secret. The paragraph also discusses how an earlier arrival of the FFT might have prevented the escalation of the nuclear arms race. It highlights the broader applications of the FFT in various fields, emphasizing its importance as a foundational algorithm for signal processing.
🌍 The Legacy and Potential of FFT in Global Security and Technology
The fifth paragraph reflects on the historical significance of the FFT, discussing how its earlier discovery by Carl Friedrich Gauss, and its subsequent obscurity, may have shaped the trajectory of global events, including the nuclear arms race. It contemplates the 'what if' scenario of a comprehensive test ban treaty being reached had the FFT been available earlier. The paragraph also underscores the transformative impact of the FFT on modern technology, including data compression and signal processing, and recognizes its importance in various scientific and technological advancements.
💼 The Value of Career Impact and the Role of 80,000 Hours
The final paragraph transitions from the technical discussion of FFT to the broader theme of career impact, introducing the nonprofit organization 80,000 Hours. It discusses the importance of choosing a career that aligns with one's desire to make a positive difference in the world. The paragraph highlights the resources provided by 80,000 Hours, such as career guides, a job board, and a podcast, aimed at helping individuals find fulfilling careers that address global challenges. It concludes with an encouragement for viewers to explore these resources and make informed decisions about their career paths.
Keywords
💡Fast Fourier Transform (FFT)
💡Nuclear Arms Race
💡Seismometer
💡Signal Processing
💡Discrete Fourier Transform (DFT)
💡Nuclear Test Ban Treaty
💡Radar and Sonar
💡5G and WiFi
💡Compression Algorithms
💡Richard Garwin
💡Carl Friedrich Gauss
Highlights
The Fast Fourier Transform (FFT) is considered the most important algorithm of all time due to its widespread use in signal processing.
FFT is utilized in technologies such as radar, sonar, 5G, and WiFi, and is fundamental in processing any type of signal.
The algorithm was originally discovered by scientists attempting to detect covert nuclear weapons tests, potentially impacting the nuclear arms race.
Post-World War II discussions led to the Baruch Plan, proposing international control over nuclear materials, but it was rejected by the Soviets.
Nuclear testing, especially in remote locations, resulted in significant environmental and health impacts, prompting public outcry and calls for a test ban.
The 1958 Geneva Conference on the Discontinuance of Nuclear Weapon Tests marked a pause in nuclear testing during negotiations.
Detecting underground nuclear tests is challenging due to the containment of radiation and the difficulty in verifying compliance.
Seismometers and Fourier transforms were considered as methods to detect underground detonations by analyzing ground vibrations.
A Fourier transform decomposes a signal into sine waves, revealing the presence of different frequencies and their amplitudes.
The Discrete Fourier Transform (DFT) is used for real-world signals composed of finite data points, resulting in a discrete frequency spectrum.
The Fast Fourier Transform algorithm reduces the computational complexity from N squared to N log N, making it practical for large data sets.
The FFT was first discovered by Carl Friedrich Gauss in 1805 but was not published until after his death, delaying its impact.
James Cooley and John Tukey's 1965 publication of the FFT algorithm revolutionized signal processing and had various applications.
The FFT is fundamental in data compression algorithms, facilitating the storage and transmission of images and videos.
Had Gauss published the FFT in an accessible form, it could have influenced the nuclear arms race and the adoption of a comprehensive test ban.
The FFT's applications extend to solving differential equations, studying crystal structures, and enabling technologies like WiFi and 5G.
Gilbert Strang, a mathematician, regards the FFT as the most important numerical algorithm of our lifetime due to its extensive use.
The video is sponsored by 80,000 Hours, a nonprofit organization helping individuals find impactful careers.
Transcripts
This is a video about the most important algorithm of all time,
the Fast Fourier Transform or FFT.
I mean, you use it all the time
including right now to watch this video
and it's used in radar and sonar, 5G and WiFi.
Basically, anytime a signal is processed,
there's a good chance that a Fast Fourier Transform
is involved.
But on top of all this, the FFT was discovered by scientists
trying to detect covert nuclear weapons tests.
And if they had discovered it sooner,
it may have put a stop to the nuclear arms race.
You know I always assumed that the nuclear arms race
was inevitable, that once the U.S. dropped atomic bombs
on Hiroshima and Nagasaki,
it was just unavoidable that all the other major powers
would develop nukes.
But maybe it wasn't, because it was clear to everyone
that nuclear weapons were game changing.
The bomb dropped on Hiroshima released a thousand times
more energy than the biggest conventional explosive
so other countries were rightfully concerned.
At the end of the war, Canada and the U.K.
requested a meeting to discuss what would be done
about nuclear weapons.
And contrary to what I expected,
the U.S. was actually open to these discussions.
They realized that they wouldn't be the only nation
with nukes for long.
So they offered to decommission all their nuclear weapons
if other nations would pledge never to make them.
This was known as the Baruch plan
and it proposed that an international body would control
all radioactive materials on earth,
from mining to refining to using these materials
for peaceful purposes such as nuclear power generation.
But the Soviets rejected the proposal.
They saw it as just another ploy to maintain
American nuclear dominance
and so the global nuclear arms race began.
To develop new nuclear weapons required extensive testing
and most of it was done in remote places like the Arctic
or the South Pacific Islands.
The U.S. also created a nuclear testing site in Nevada,
the radioactive products of which blew across the country
so that outside of Japan, the people most adversely affected
by American nuclear weapons were Americans themselves.
The U.S. soon graduated from fission weapons
to thermonuclear bombs which combined fission and fusion
to release even more energy.
They were as big a leap from the first atomic bombs
as those devices were from conventional explosives,
a thousand times again more powerful.
- Annihilation of any life on earth has been brought
within the range of technical possibility.
- [Narrator] In 1954 at Bikini Atoll in the South Pacific,
the U.S. tested a new thermonuclear design
that used lithium deuteride as a fuel
and they expected the device code named Shrimp
to release the energy equivalent of six million tons of TNT
but they were wrong.
(bomb blasts)
It released two and a half times that
due to unanticipated reactions with lithium-7.
And as a result, more radioactive material was created
and it rained down over a much larger area than planned.
Residents of neighboring atolls were only evacuated
three days later suffering from radiation sickness.
And further east, the 23 crew members of a Japanese fishing boat
were caught in a flurry of radioactive white ash.
And by that evening,
they were suffering from acute radiation syndrome.
One of the crew died from ensuing complications
six months later.
- [Reporter] As a result of the extended illness
brought about by his exposure to the radiation fallout.
- These events triggered public outcry
against nuclear testing.
The modern peace sign was designed in the 1950s
by combining the semaphore letters for N and D
standing for nuclear disarmament.
A number of world leaders called for a comprehensive
test ban, no more testing of nuclear weapons by any state
and this call was actually taken seriously
by the world's nuclear powers.
They entered into negotiations at meetings
with very literal names like
the Conference on the Discontinuance of Nuclear Weapon Tests
held in Geneva in 1958.
And to show just how serious they were,
they even stopped all testing during the negotiations
which is why 1959 is the only year in a long stretch
when no nuclear weapons were detonated.
But there was a big problem with negotiating
a comprehensive test ban which is how do you know
that the other side will hold up their end of the bargain?
The U.S. worried that the Soviets would continue
testing covertly and leapfrog them technologically
and the Soviets similarly distrusted the U.S.
So to address these concerns,
they convened the Conference of Experts to Study
the Possibility of Detecting Violations
of a Possible Agreement on Suspension of Nuclear Tests.
Seriously, that was the name, I am not making this up.
Now detecting atmospheric tests was fairly straightforward.
The radioactive isotopes they produced disperse
in the atmosphere and can be detected
thousands of kilometers away.
Underwater tests produce distinctive sounds
that travel efficiently around a kilometer below the surface
of the ocean and these can be picked up by hydrophones.
But underground tests are much harder to detect from afar
because their radiation is mostly contained
and the Soviets refused to allow onsite compliance visits
which they regarded as espionage.
And this is ultimately why when a test ban treaty was signed
in 1963, it was only a partial ban.
It banned testing underwater,
in the atmosphere and in space,
places where compliance could be verified,
but it didn't ban testing underground for the simple reason
that it was almost impossible to verify compliance.
But scientists had been trying to find a way
to reliably detect underground detonations.
Following the Geneva meeting, American and Soviet scientists
formed a working group to discuss the issue
and their idea was to use seismometers located
outside the countries where nukes were being tested
to detect the faint ground vibrations caused by explosions.
The problem was how do you distinguish a nuclear test
from an earthquake?
Every day around the world,
there are close to 300 earthquakes
with a magnitude of three or greater.
In addition, part of the purpose of detecting
your adversary's tests is to spy on them,
to know how big of an explosion they can make.
But the seismometer signal depends not only on the yield
of the device but also on how deep it was buried.
For a given yield, deeper explosions appear smaller.
So scientists wanted a method to reliably determine
whether a given signal was a bomb or an earthquake
and if it was a bomb, how big it was
and how deep it was buried.
They knew that all this information was contained
in the seismometer signal but it couldn't just be read off
by looking at the squiggles.
They needed to know how much of different frequencies
were present which meant they needed to take
a Fourier transform.
A Fourier transform is a way of decomposing a signal
into pure sine waves, each with its own amplitude
and frequency that add to make it up.
This seems a bit like magic since the sum
of a set of sine waves can look arbitrarily complicated
and nothing like its component parts.
There are some elegant ways to understand
how Fourier transforms work,
shout out to the awesome video by 3Blue1Brown.
But I wanna take more of a brute force approach.
If you wanna know how much of a particular sine wave
is in a signal, just multiply the signal by the sine wave
at each point and then add up the area under the curve.
As a simple example, say our signal is just a sine wave
with a certain frequency but pretend we don't know that
and we're trying to figure out
which sine waves add to make it up.
Well, if you multiply this signal by a sine wave
of an arbitrary frequency, the two waves are uncorrelated.
You're just as likely to find places where they have
the same sign, both positive or both negative
as where they have opposite signs and therefore
when you multiply them together, the area above the x-axis
is equal to the area below the x-axis.
So these areas add up to zero which means
that frequency sine wave is not a part of your signal
and this will be true for almost all frequencies
you could try assuming you're looking over
a long enough timeframe.
The only exception is if the frequency of the sine wave
exactly matches that of the signal,
now these waves are correlated
so their product is always positive and so is the area
under the curve that indicates
that this sine wave is part of our signal.
And this trick works even if the signal is composed
of a bunch of different frequencies.
If the sine wave's frequency is one of the components
of the signal it will correlate with the signal
producing a non-zero area.
And the size of this area tells you the relative amplitude
of that frequency sine wave in the signal.
Repeat this process for all frequencies of sine waves
and you get the frequency spectrum.
Essentially which frequencies are present
and in what proportions.
Now so far I've only talked about sine waves
but if the signal is a cosine wave,
then even if you multiply it by a sine wave
of the exact same frequency,
the area under the curve will be zero.
So for each frequency,
we actually need to multiply by a sine wave
and a cosine wave and find the amplitudes for each.
The ratio of these amplitudes indicates the phase
of the signal that is how much it's shifted
to the left or to the right.
You can calculate these sine and cosine amplitudes
separately or you can use Euler's formula
so you only need to multiply your signal
by one exponential term.
Then the real part of the sum is the cosine amplitude
and the imaginary part is the sine amplitude.
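This brute-force recipe can be tried directly in code. A minimal Python sketch, using a made-up two-tone test signal (the frequencies and amplitudes are toy choices, not from the video):

```python
import math

# Brute-force Fourier analysis: to ask how much of a given frequency is in
# a signal, multiply the signal by a probe sine wave point by point and sum
# the products (the "area under the curve").
N = 1000  # samples over a 1-second window
t = [i / N for i in range(N)]
# Toy signal: a 3 Hz sine (amplitude 1) plus a 5 Hz sine (amplitude 0.5)
signal = [math.sin(2 * math.pi * 3 * ti) + 0.5 * math.sin(2 * math.pi * 5 * ti)
          for ti in t]

def correlate(freq):
    # Average of signal * probe sine; nonzero only if freq is in the signal
    return sum(s * math.sin(2 * math.pi * freq * ti)
               for s, ti in zip(signal, t)) / N

for f in [2, 3, 4, 5]:
    print(f, "Hz:", round(correlate(f), 3))
# Mismatched frequencies average to ~0; 3 Hz gives 0.5 and 5 Hz gives 0.25
# (half each component's amplitude, since the average of sin^2 is 1/2).
```

The uncorrelated probes cancel to zero exactly as described in the transcript, while the matching frequencies produce sums proportional to their amplitudes.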
In the American delegation at the Geneva meeting,
there was a physicist, Richard Garwin
and a mathematician John Tukey.
They got into a debate with the Soviet delegation
over which nation's seismometers were superior.
So Garwin simulated the responses of both countries' devices
on a computer at CERN.
The next day, everyone agreed there wasn't much difference.
The real obstacle to detecting underground explosions
wasn't the accuracy of the seismometers,
it was the vast amounts of computation required
to Fourier transform the seismometer signals.
Here's an example seismometer signal
and its Fourier transform.
Thus far I've been thinking about signals
as infinite continuous waves
and when you take their Fourier transform,
you get an infinite continuous frequency spectrum.
But real world signals are not like that.
They are finite and made up of individual samples
or data points.
Even though a seismometer signal looks smooth
and continuous, it doesn't record ground motion
with infinite precision.
There is some fundamental graininess to the data
so what you have is discrete, finite data.
So you can't use the idealized Fourier transform.
Instead, you have to perform something
called a Discrete Fourier Transform.
And one of the distinguishing features
of a Discrete Fourier Transform
is that the frequency spectrum is also discrete and finite.
You can think of the frequencies
as falling into a finite number of bins.
And what determines the number and size of these bins
is the number of samples in the signal
and how closely spaced they are.
For example, the more spaced out the samples are,
the lower the maximum frequency you can measure,
because the samples aren't close enough together
to capture high-frequency oscillations.
And the shorter the duration of the signal,
the harder it is to tell similar frequencies apart,
so this lowers the frequency resolution.
The shorter the signal, the wider each frequency bin is.
The lowest non-zero frequency you can effectively measure
is one whose period is equal to the duration of the signal.
And the higher frequency bins are just integer multiples
of this frequency.
So they fit two, three, four, and so on periods
in the duration of the signal.
The total number of frequency bins is equal
to the number of samples in the signal.
So if the signal is made up of eight samples,
then the transform has eight frequency bins
going from zero up to seven times the fundamental frequency.
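That bin structure can be written down in a couple of lines (assuming, as in the example that follows, samples spanning one second):

```python
# Bin structure of a discrete Fourier transform: N samples spanning a
# duration T give N frequency bins at integer multiples of the
# fundamental frequency 1/T.
def bin_frequencies(n_samples, duration):
    fundamental = 1.0 / duration
    return [k * fundamental for k in range(n_samples)]

# Eight samples over one second -> eight bins from 0 Hz up to 7 Hz
print(bin_frequencies(8, 1.0))
```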
So let's do an example.
Say we have a signal made up of eight samples.
Well, then the discrete Fourier transform
will have eight frequency bins.
The first bin corresponds to a frequency of zero.
Essentially this measures if the signal
is systematically shifted off the x-axis.
You can think of it as measuring the DC offset.
You multiply each data point by one
and add them all together, in this case, they add to zero.
The second frequency bin corresponds to the frequency
that fits one period in the duration of the signal.
So in this case that corresponds to one hertz.
You multiply each point by a sine wave of this frequency
and a cosine wave of this frequency
and separately add them up.
For sine, they add to zero.
For cosine, they add to four
and then repeat the process for two hertz, three hertz,
and so on, up to seven hertz.
And you have the discrete Fourier transform
of this really simple signal.
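The worked example can be checked numerically. Here the eight samples are assumed to be one cycle of a cosine, which reproduces the sums described above (zero for the first bin, and a cosine sum of four with a sine sum of zero for the 1 Hz bin):

```python
import math

# Direct ("slow") discrete Fourier transform: for each frequency bin,
# multiply every sample by a cosine and a sine of that frequency and
# add up the products separately.
N = 8
x = [math.cos(2 * math.pi * n / N) for n in range(N)]  # one cosine cycle

def dft_bin(k):
    cos_sum = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
    sin_sum = sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
    return cos_sum, sin_sum

for k in range(N):
    c, s = dft_bin(k)
    print(f"bin {k}: cosine sum {c:+.1f}, sine sum {s:+.1f}")
# bin 0 sums to 0, bin 1 gives cosine sum 4 and sine sum 0,
# and bin 7 mirrors bin 1 (the usual DFT symmetry for real signals).
```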
Now this process works fine in principle
and you could use it to calculate
all discrete Fourier transforms.
But the problem is it requires way too many calculations.
To complete one discrete Fourier transform
requires multiplying N data points
by N different frequency waves.
So N squared complex multiplications.
Now this is doable for eight samples
but if you had say a million samples,
that would require a million squared
or one trillion calculations.
At the speed of 1960s computers,
that would take over three years to complete,
all for a single transform.
Now a million samples is admittedly more than you would need
for a single seismic event but to analyze tens of events
per day from hundreds of seismometers,
it would just be far too time consuming
and that's what gave scientists the impetus
to look for a better way.
And the breakthrough came in 1963 at a meeting
of the President's Science Advisory Committee.
President John F. Kennedy was there,
as were Garwin and Tukey,
the physicist and mathematician from the Geneva meeting.
Although they were discussing issues of national importance
like the Apollo program and nuclear fallout shelters,
the meeting was apparently pretty boring.
Garwin observed Tukey doodling throughout
but what he was actually doing
was working on discrete Fourier transforms.
At the end of the meeting, Garwin asked Tukey
what he had worked out and he was shocked to learn
that Tukey knew a way to compute discrete Fourier transforms
with many fewer computations.
It would mean that the calculation that would've taken
over three years could be done in just 35 minutes.
This has aptly become known as the Fast Fourier Transform.
So here is how it works.
Consider the example from before with eight samples.
Each of those data points must be multiplied
by the eight different frequency waves.
Here I'm only showing cosines for simplicity.
So you would expect this to require eight times eight
or 64 complex multiplications.
But due to the periodic nature of sinusoids,
these waves of different frequencies
overlap in a predictable way.
For example, at the middle data point,
all of the four odd frequencies have the same value
and all of the four even frequencies
have the same value as well.
So instead of doing eight multiplications,
you only need to do two.
And this sort of duplication occurs
at the other data points as well.
So instead of performing 64 calculations, you need only 24.
Now that might seem like a small improvement
but it's the difference between a calculation
that scales as N squared, where N is the number of samples,
versus one that scales as N log base two of N
which means the bigger the data set,
the greater the savings.
A signal with a thousand samples would require
100 times fewer calculations and a million samples
would require 50,000 times fewer calculations.
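Those savings factors follow from the ratio of the two operation counts, N squared versus roughly N times log base two of N. A quick sanity check:

```python
import math

# Ratio of direct-DFT work (N * N) to FFT work (about N * log2(N)):
# the savings factor grows as N / log2(N).
def savings_factor(n):
    return (n * n) / (n * math.log2(n))

print(round(savings_factor(1_000)))      # about 100x for a thousand samples
print(round(savings_factor(1_000_000)))  # about 50,000x for a million samples
```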
But how do you know which calculations are redundant?
Well, take your samples and split them
into even and odd index points.
You still need to multiply
each of these by the eight different frequency waves.
But now let's look only at the even points and compare
the first four frequencies to the second four frequencies.
And what you find is that in each case
at the location of the samples,
the values of the two sine waves are the same.
A similar pattern can be observed for the odd index points
except the values of one sine wave are the negative
of the other.
More generally, they're related by a complex number.
But what this means is that you don't have to do
all the multiplications for the second half
of the frequencies.
Once you've calculated the odd and even sums
for the lower half the frequencies,
you can reuse these values to find the upper half.
So you've effectively cut the number of calculations
required in half but that's only a factor of two,
how do you get down to N log base two of N?
Well, you repeat the same trick,
split the samples again into even and odd points
and then again repeatedly
until you're down to single data points.
At each split, you can exploit the symmetries
of sinusoidal functions to cut the number of calculations
in half.
And that is how the Fast Fourier Transform reduces
N squared calculations down to N log N.
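The even/odd splitting described above is exactly the recursive structure of the Cooley-Tukey algorithm. A minimal Python sketch for power-of-two lengths, tested on the eight-sample cosine from earlier:

```python
import cmath
import math

def fft(x):
    # Recursive radix-2 FFT: split into even- and odd-indexed samples,
    # transform each half, then reuse those sums for both the lower and
    # upper halves of the frequency bins.
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor: the complex number relating the two halves
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w           # lower half of the frequencies
        out[k + n // 2] = even[k] - w  # upper half, at no extra multiplies
    return out

# One cycle of a cosine over eight samples: the energy lands in bins 1 and 7
samples = [math.cos(2 * math.pi * n / 8) for n in range(8)]
spectrum = fft(samples)
print([round(abs(v), 3) for v in spectrum])
```

Each level of recursion does about N multiplications and there are log base two of N levels, which is where the N log N total comes from.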
And it's why today whenever anyone takes
a Fourier transform, it's almost always done
as a Fast Fourier Transform.
Garwin approached an IBM researcher, James Cooley
to program a computer to run the FFT algorithm
but he didn't tell him the reason was to detect
underground Soviet nuclear tests.
He said it was to work out the spins
in a crystal of helium-3.
Cooley and Tukey published the algorithm
in a seminal 1965 paper and its use immediately took off
but it was too late to secure
a comprehensive nuclear test ban.
By that time, the U.K., France and China
had joined the Soviet Union and the U.S. as nuclear powers.
And the partial test ban treaty,
far from deescalating the nuclear arms race
just sent it underground.
The thinking was if you were only allowed
to test underground, then you better be testing extensively
so as not to fall behind all the other nuclear states.
So from 1963 onward, 1,500 nuclear weapons were detonated.
That's roughly one a week for 30 years.
This testing facilitated the construction
of an absurd number of nukes.
At the peak in the mid 1980s,
70,000 nuclear warheads were in existence.
Total expenditure on nukes in the 20th century
is estimated at around $10 trillion each for the U.S.
and the Soviet Union adjusting for inflation.
If only scientists had been confident in their ability
to remotely detect underground tests in the early 1960s,
then a comprehensive test ban could have been reached,
stopping the nuclear arms race before it really got going.
To check how realistic this is,
I asked Richard Garwin himself.
- Well, it's a good story.
- It is a good story.
- It would've helped and it might have turned the tide.
- But that would've required an earlier discovery
of the Fast Fourier Transform and as luck would have it,
it was discovered earlier but then forgotten.
All the way back in 1805, the mathematician Carl Friedrich Gauss
was studying the newly discovered asteroids
Pallas, Ceres and Juno.
To determine the orbit of Juno,
Gauss devised a novel approach to harmonic analysis
and he jotted it down in his notes
but later used a different method
and he never thought to publish that first insight.
Today we can see that Gauss had figured out
the discrete Fourier transform before Joseph Fourier himself
published it in 1807.
And he had developed the same Fast Fourier Transform
algorithm as Cooley and Tukey a century and a half earlier.
The reason his breakthrough was not widely adopted
was because it only appeared after his death in volume three
of his collected works and it was written with non-standard
notation in a 19th century version of Latin.
What do you think would've happened if Gauss had realized
the significance of his discovery and had published it
in a way that others could understand?
With our modern network of seismometers
and computing and the Fast Fourier Transform,
today, we have the ability to detect magnitude-three events,
which correspond to a one-kiloton or so explosion,
basically anywhere on Earth.
Following Cooley and Tukey's paper, the use of FFTs exploded
and they are the basis for most compression algorithms
like the ones that allow you to watch
and listen to this video.
Here's how the Fast Fourier Transform
allows you to compress an image.
You take the pixel brightness values for each row
of an image and perform a Fast Fourier transform.
So essentially, you're figuring out what frequencies
are present in the brightness values
of the rows of an image.
Here, brightness represents the magnitude
of each frequency component and the color
represents the phase, how shifted that frequency is
to the left or the right.
And then you perform another FFT for the columns of pixels.
It doesn't matter whether you do the rows or the columns first;
what you end up with is a two-dimensional FFT
of the original image.
For almost all real world images,
you find that a lot of the values in the transform
are close to zero.
I've taken the log of the transform values
so you can see them. Without the log, it's clear
that most of the values, especially toward the edges,
are very small, and these correspond to high frequencies.
And what this means is that you can throw out
a lot of the information in the transformed image
say 99% of it.
But when you invert that result, you still get a fairly good
representation of the original image.
So on your computer, most images are saved
in a compressed form like this, as a two-dimensional Fourier-type transform,
and then when you wanna look at the picture again,
the computer simply inverts the transform.
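The rows-then-columns transform and the throw-out-the-small-coefficients step can be shown in a toy Python sketch. This is an illustration of the idea only, not the pipeline real image formats use: it runs a plain, slow DFT rather than an FFT so the code stays short, and it works on a small square grid of brightness values:

```python
import cmath

def dft2(img, sign=-1):
    """2-D discrete Fourier transform of a small n-by-n image:
    transform every row, then every column (the order doesn't matter)."""
    n = len(img)
    def dft1(seq):
        return [sum(v * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                    for j, v in enumerate(seq)) for k in range(n)]
    rows = [dft1(r) for r in img]            # rows first...
    cols = [dft1(c) for c in zip(*rows)]     # ...then columns
    return [list(r) for r in zip(*cols)]

def compress(img, keep):
    """Keep only the `keep` largest-magnitude frequency components,
    zero out the rest, and invert the transform."""
    n = len(img)
    F = dft2(img)
    by_size = sorted((abs(F[u][v]), u, v)
                     for u in range(n) for v in range(n))
    for _, u, v in by_size[:-keep]:          # throw out the small ones
        F[u][v] = 0
    inv = dft2(F, sign=+1)                   # flipping the sign inverts the DFT
    return [[p.real / n**2 for p in row] for row in inv]
```

With `keep` close to n² the inverted result matches the original almost exactly; with `keep=1` you are left with only the average brightness, and values in between trade file size against image quality.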
There are so many applications for FFTs
from solving differential equations to radar and sonar,
studying crystal structures, WiFi and 5G.
Basically all kinds of signal processing use FFTs
and that's why mathematician Gilbert Strang called the FFT
the most important numerical algorithm of our lifetime.
If only it had become more widely adopted
after Gauss discovered it,
the FFT may have even more dramatically
transformed our world.
Now I don't think Gauss could ever have imagined
how important the FFT would be,
just as most people don't think about the cumulative impact
of their life's work.
You know in an average career, you work 40 hours a week,
50 weeks a year for 40 years
and that works out to 80,000 hours.
It means that your career might be your biggest opportunity
to make a difference in the world.
So what do you wanna do with all that time?
Now the name of this video's sponsor is 80,000 Hours,
referencing the number of hours you spend at work
over the course of a career,
and they are not selling anything.
80,000 Hours is a nonprofit that helps people
find fulfilling careers that make a positive impact.
The typical advice from career centers or aptitude tests
really just focuses on finding a job
that fits your personal preferences.
But what if you care more about doing good?
Well, they won't really know what to tell you
besides a few well known professions
like being a doctor or a teacher.
When I finished college, I knew I liked filmmaking, teaching
and science and that I wanted to have a big positive impact
but YouTube didn't exist yet and so I honestly had no idea
what I was gonna do.
Now I feel really fortunate to be able to do something
every day that I both enjoy
and which makes a positive impact on the world.
So believe me, there are a lot of things out there
that you can do and 80,000 Hours offers tons of resources
to help you find those things.
From research backed guides on how to pick a career
that tackles pressing issues
to a regular newsletter and podcast.
They even have a curated job board that's kept up to date
with hundreds of jobs they think will make an impact.
They have done over 10 years of research alongside academics
at Oxford University on how to find a career that is both
fulfilling and which makes a positive difference.
So their recommendations are accurate, specific,
and actionable.
If you care about what the evidence says
about having an impactful career and you want real advice
that goes beyond empty cliches like follow your passion,
then check out 80,000 Hours.
Everything they provide from their podcast
to their job board is free forever.
If you join the newsletter right now,
you'll also get a free copy of their in-depth career guide
sent right to your inbox.
So to get started on a career that tackles
the world's most pressing problems,
sign up now at 80000hours.org/veritasium.
I will put that link down in the description.
So I wanna thank 80,000 Hours
for sponsoring this part of the video
and I wanna thank you for watching.