mod04lec19 - NISQ-era quantum algorithms
Summary
TL;DR: In this video, Shesha Raghunathan from IBM Systems discusses the evolution and importance of variational quantum algorithms in the NISQ (Noisy Intermediate-Scale Quantum) era. Highlighting the limitations of current quantum hardware, such as noise and qubit count, Raghunathan explains how variational algorithms like VQE and QAOA are structured to fit within these constraints. These hybrid algorithms leverage classical hardware for optimization while performing quantum calculations for tasks like energy evaluation. The talk also touches on the potential applications of these algorithms in quantum machine learning, optimization problems, and more, emphasizing the growing interest and development in this field.
Takeaways
- 📚 The speaker, Shesha Raghunathan, is an IBM Quantum Distinguished Ambassador and leads the ambassador program in India and South Asia.
- 🌟 The talk focuses on modern quantum algorithms, specifically variational algorithms, and their potential applications.
- 🔍 Quantum computing has evolved through three broad generations, starting from Richard Feynman's conceptualization in 1981 to the current era of NISQ (Noisy Intermediate-Scale Quantum) computers.
- ☁️ IBM's release of quantum machines on the cloud in 2016 marked a significant shift, making quantum computing more accessible and sparking increased interest in programming quantum hardware.
- 🔧 The NISQ era, a term coined by John Preskill in 2018, refers to quantum computers that are noisy and have a limited number of qubits, challenging developers to create algorithms that can provide value despite these constraints.
- 🚀 Variational quantum algorithms, such as VQE and QAOA, emerged in 2014 and gained traction post-2016, aligning with the hardware limitations of the time by focusing on shorter circuit depths.
- 📈 The lifetime of superconducting qubits has exponentially increased over the last 15-20 years, with recent advancements pushing towards millisecond lifetimes, allowing for more complex computations.
- 💡 Variational algorithms are hybrid, utilizing both classical and quantum computing. They are well-suited for the current NISQ hardware, which has constraints on circuit depth and noise levels.
- 🌐 Real-world applications of variational quantum algorithms are being explored, including quantum machine learning, option pricing, and battery optimization.
- 🛠️ The current state of quantum hardware, as of July 2021, shows average qubit lifetimes around 100-120 microseconds, with readout errors dominating over gate errors, emphasizing the need for compact and shallow quantum algorithms.
Q & A
Who is Shesha Raghunathan and what is his role at IBM?
-Shesha Raghunathan is part of IBM Systems and works with the Electronic Design Automation team, particularly on timing analysis. He is also an IBM Quantum Distinguished Ambassador, a Qiskit Advocate, and a Technical Ambassador, leading the ambassador program in India and South Asia.
What is the significance of the year 1981 in the context of quantum computing?
-The year 1981 is significant because it marks the starting point when Richard Feynman contextualized quantum computing in a more modern form factor.
Why is the year 2016 considered a turning point for quantum computing?
-2016 is considered a turning point because that's when IBM put its quantum machine on the cloud for public access, along with a programming platform to program that hardware, which revolutionized the accessibility and programming of quantum computers.
What does the term NISQ stand for and who coined it?
-NISQ stands for Noisy Intermediate-Scale Quantum. The term was coined by John Preskill in 2018 to describe quantum computers that are noisy and have a limited number of qubits.
What is the difference between traditional quantum algorithms and those developed for NISQ-era hardware?
-Traditional quantum algorithms assume qubits are clean and error-free, whereas NISQ-era algorithms are designed to work with noisy qubits and take into account the hardware's limitations, such as a small number of qubits and noise.
What are variational quantum algorithms and why are they important for NISQ-era hardware?
-Variational quantum algorithms are hybrid algorithms that use both quantum and classical computing to solve problems. They are important for NISQ-era hardware because they are designed to be compact and shallow, fitting within the hardware's time and error constraints, and can potentially demonstrate quantum advantage.
What is Quantum Volume and what does it indicate about a quantum computer's capabilities?
-Quantum Volume is a measure of the power of a quantum computer, taking into account the number of qubits, the error rates, and the connectivity of the qubits. A higher Quantum Volume indicates a more capable quantum computer.
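To make the number concrete: by IBM's published definition, Quantum Volume is 2^n, where n is the width and depth of the largest "square" model circuit the machine runs reliably. The helper below is only an illustrative sketch of that relationship, not IBM's actual benchmark code.

```python
import math

def largest_square_circuit(quantum_volume):
    """Width/depth n of the largest 'square' model circuit implied by QV = 2**n."""
    return int(math.log2(quantum_volume))

# The IBM machines cited in the talk (Mumbai, Kolkata, Montreal) report QV 128,
# i.e. they reliably run model circuits of 7 qubits and depth 7.
print(largest_square_circuit(128))  # 7
```

So a Quantum Volume of 128 does not mean 128 usable qubits; it means 7-qubit, depth-7 circuits pass the benchmark.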
What is the current state of qubit lifetimes in quantum computers as of the script's reference date?
-As of July 10th, 2021, the average qubit lifetime in quantum computers is around 100-120 microseconds, with some experimental qubits reaching milliseconds.
What are the average error rates for the quantum computers mentioned in the script?
-The average CNOT error rate is around 0.1 percent, and the readout error rate is around 1 percent for the quantum computers mentioned in the script.
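A hedged back-of-the-envelope way to see why these rates force shallow circuits: if each CNOT and each readout is treated as an independent error source (a simplifying assumption; real device noise is correlated), the circuit's success probability decays exponentially with the CNOT count.

```python
def est_success_probability(n_cnots, n_measured_qubits,
                            p_cnot=0.001, p_readout=0.01):
    """Crude success estimate assuming independent errors.

    Default rates follow the averages quoted in the talk
    (CNOT ~0.1%, readout ~1%); this is only an
    order-of-magnitude sketch, not a device model.
    """
    return (1 - p_cnot) ** n_cnots * (1 - p_readout) ** n_measured_qubits

# A 100-CNOT circuit on 5 measured qubits still succeeds ~86% of the time...
print(round(est_success_probability(100, 5), 2))  # 0.86
# ...but a 2000-CNOT circuit is mostly noise.
print(round(est_success_probability(2000, 5), 2))  # 0.13
```

This is why the talk stresses compact, shallow circuits: a few thousand two-qubit gates is already enough for noise to dominate the output at these error rates.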
How do variational algorithms fit into the limitations of current quantum hardware?
-Variational algorithms are structured as hybrid algorithms with a classical component for optimization and a quantum component for computation. This structure allows them to perform well within the current hardware limitations, such as short qubit lifetimes and error rates.
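The hybrid loop can be sketched with a toy one-qubit "VQE" simulated entirely classically. Here the quantum step is stood in for by the closed-form expectation ⟨Z⟩ = cos θ of the state Ry(θ)|0⟩; on real hardware this value would come from repeated shots of a shallow circuit. The optimizer, step sizes, and function names are illustrative, not from the lecture.

```python
import math
import random

def quantum_energy(theta):
    """Stand-in for the QUANTUM step: expectation value <Z> of the
    state Ry(theta)|0>, which is cos(theta). On hardware this would
    be estimated from many shots of a short parameterized circuit."""
    return math.cos(theta)

def classical_optimize(energy_fn, theta=0.3, step=0.1, iters=200):
    """Stand-in for the CLASSICAL step: a simple random hill-descent
    that tunes the circuit parameter between quantum evaluations."""
    energy = energy_fn(theta)
    for _ in range(iters):
        candidate = theta + random.uniform(-step, step)
        e_new = energy_fn(candidate)       # call out to the "quantum" side
        if e_new < energy:                 # keep parameters that lower the energy
            theta, energy = candidate, e_new
    return theta, energy

random.seed(0)
theta, energy = classical_optimize(quantum_energy)
print(round(energy, 2))  # approaches the ground-state energy -1
```

The division of labor is the point: the quantum device only ever runs a short, fixed-depth circuit per iteration, while the classical machine absorbs all the bookkeeping of the optimization loop.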
What are some potential applications of variational quantum algorithms mentioned in the script?
-Some potential applications of variational quantum algorithms include quantum machine learning, option pricing, and battery optimization.
Outlines
🌟 Introduction to Quantum Algorithms and NISQ Era
The speaker, Shesha Raghunathan, introduces the topic of modern quantum algorithms, specifically variational algorithms, and their potential applications. Shesha is an IBM Quantum Distinguished Ambassador and has a background in computer architecture from the University of Southern California. The discussion begins with an overview of quantum computing generations, starting from Richard Feynman's conceptualization in 1981 to the current era, marked by IBM's release of quantum machines on the cloud in 2016. This shift made quantum computing more accessible, leading to increased interest in programming quantum hardware. The era of Noisy Intermediate-Scale Quantum (NISQ) computing is characterized by the presence of noise in qubits and the limited number of qubits available, prompting the need for algorithms that can operate effectively within these constraints.
📈 Historical Progression and Evolution of Quantum Algorithms
The video script delves into the history of quantum algorithms, starting from Feynman's proposal in 1981 and progressing through various stages of development. It highlights the transition from theoretical exploration to practical applications, such as Shor's algorithm and Grover's algorithm, which demonstrated quantum computing's potential to solve real-world problems. The script also discusses the emergence of variational quantum algorithms like VQE and QAOA in 2014, which gained prominence with the advent of cloud-based quantum computing platforms. These algorithms are designed to work within the limitations of NISQ-era hardware, focusing on shorter circuit depths and hybrid classical-quantum computation. The summary also touches on the exponential increase in qubit lifetimes, reflecting the ongoing improvements in quantum hardware and the potential for more complex quantum computations.
🛠️ The Relevance of Variational Quantum Algorithms in NISQ
The final paragraph emphasizes the importance of variational quantum algorithms in the context of NISQ-era hardware. It discusses the constraints imposed by the current state of quantum technology, such as the limited lifetime of qubits and the prevalence of noise, which necessitate the development of algorithms that are compact and can be executed within a short time frame. The script introduces the concept of quantum volume, a measure of the power of quantum computers, and provides examples of current IBM quantum machines with their respective specifications. It also highlights the average error rates and the average lifetime of qubits, which are critical factors in determining the feasibility of quantum algorithms. The paragraph concludes by underscoring the significance of variational algorithms in achieving quantum advantage, given the current limitations of quantum hardware, and sets the stage for further exploration of these algorithms and their applications.
Keywords
💡Quantum Algorithms
💡Variational Algorithms
💡Noisy Intermediate-Scale Quantum (NISQ)
💡Quantum Volume
💡Qubit
💡Quantum Advantage
💡Error Correction
💡Hybrid Algorithms
💡Quantum Machine Learning
💡Lifetime
Highlights
Introduction to modern quantum algorithms and variational algorithms by Shesha Raghunathan.
Shesha Raghunathan's background in computer architecture and role at IBM Systems.
Three generations of quantum computing: experimental, noisy intermediate-scale quantum (NISQ), and fault-tolerant.
IBM's role in mainstreaming quantum computing by putting quantum machines on cloud in 2016.
The concept of Noisy Intermediate-Scale Quantum (NISQ) era introduced by John Preskill in 2007.
Challenges in programming quantum hardware in the NISQ era due to noise and limited qubit numbers.
Historical development of quantum algorithms from theoretical to practical applications.
Emergence of variational algorithms like VQE and QAOA in 2014, aligning with hardware limitations.
Quantum volume as a measure of quantum computer's power, with IBM machines having a volume of 128.
The importance of qubit lifetime in quantum computing, with current state-of-the-art around 100-120 microseconds.
Error rates in quantum computing, including CNOT and readout errors, and their impact on algorithm design.
Variational quantum algorithms as a hybrid approach fitting current hardware limitations.
Potential applications of variational quantum algorithms in quantum machine learning and optimization.
The necessity for algorithms to solve problems that classical hardware finds hard within the quantum hardware's time budget.
The trend of increasing qubit lifetimes, moving towards milliseconds, and its implications for quantum computing.
The structure of variational algorithms, combining classical optimization with quantum computation.
The future of variational quantum algorithms in the NISQ era and their significance in quantum computing advancements.
Transcripts
Hi, welcome. This week we're going to talk about more modern algorithms: NISQ-era quantum algorithms. We're going to learn what variational algorithms are and their potential applications.

My name is Shesha Raghunathan. I'm part of IBM Systems, where I work with the Electronic Design Automation team, particularly on timing analysis. I'm also an IBM Quantum Distinguished Ambassador, a Qiskit Advocate, and a Technical Ambassador, and I lead the ambassador program in India and South Asia. My PhD was in computer architecture from the University of Southern California in 2010, and I've been with IBM since 2011.

There are many aspects to modern quantum algorithms. In this section we're going to cover an introduction and a motivation: why we need these variational quantum algorithms, and how they are different from the ones we had before.
But before we get started, we should understand a little of the broader scope: where we are, where we have come from, and where we are heading. Broadly, we can break quantum computing into three generations. If we take 1981 as the starting point, when Richard Feynman contextualized quantum computing in its more modern form factor, then 1981 to about 2016 is what we could broadly call the experimental generation. Why 2016? Because that's when IBM put its quantum machine on the cloud for access, along with a programming platform to program that hardware. What that did is that people then started accessing these machines and programming them. Before then it was mostly in an experimental mode, in the basement of a physics lab or in some corner of a company's research lab. The mainstreaming of quantum computing as a platform, and that too on the cloud, revolutionized many things. Since then there has been a transformation in the interest in programming this hardware; before 2016 programming was less important, and since then it has taken on a life of its own.
John Preskill coined the term Noisy Intermediate-Scale Quantum, or NISQ for short, in 2018. The observation was that the hardware is still evolving: the qubits are noisy, and the number of them is small. He was looking at qubit systems in the range of 50 to 100 qubits, and they are noisy. The question was: in this space, can we do something useful that provides value, and can we demonstrate some kind of advantage? This era is what he called Noisy Intermediate-Scale Quantum.

So what is the default, then? All the algorithms developed prior to this, the traditional ones many of you are aware of, like Shor's or Bernstein-Vazirani, don't deal with noise per se. They look at a qubit that is clean and then start programming it. Even in the 90s a lot of work happened on error correction and fault tolerance, so the worldview of programming was that the qubit is clean and we just need to run the algorithm. But now, in the NISQ era, the qubit is assumed to be noisy, and the question is how to program it so that we get the maximum out of this hardware. Although Preskill contextualized NISQ in the space of 50 to 100 qubits, we're going to cross that number pretty rapidly, so technically the definition won't hold; it has become more of a label to capture the noisy regime we are in. At some point in the future we're going to transition to what is eventually a fault-tolerant regime, where the qubits can be considered clean and the management of noise is handled more inherently within the hardware and software stack, so the algorithm need not worry about it. But that is a little further out. In the meantime we are right in the middle of the NISQ era, and the question is what we need to do to manage our algorithms.
Just a brief history; I touched on some of this in the prior chart. A lot of investigation into quantum algorithms started with Feynman's proposal in 1981. Initially there were questions about whether the idea of quantum computing provides any value, and a lot of toy problems were created to explore the complexity angle: does it provide any additional value over and above classical computing? This went on into the 90s with Deutsch-Jozsa and Bernstein-Vazirani, which are somewhat artificial problems, but ones that bring out a key factor of quantum computing and demonstrate its value proposition. Then we got into more serious work like Shor's and Grover's algorithms, which were solving real problems. A lot of algorithms have been developed since; this is by no means exhaustive. But an important thing happened in 2014: a couple of algorithms came out, VQE, which we're going to learn about in detail in this lecture, and also QAOA. VQE stands for Variational Quantum Eigensolver; QAOA is the Quantum Approximate Optimization Algorithm. These are what are called the variational algorithms, and they started the era of variational algorithms. While these algorithms were proposed in 2014, it was really in 2016, when IBM put its machine on the cloud, that they took on a life of their own. A lot of people started programming with them because they are in tune with the hardware limitations of the time: these algorithms focus on shorter depth. You don't want a deep circuit that is not computable with the quantum hardware of the current time; in the NISQ era you want it more compact. They are also hybrid algorithms, which works well on a cloud platform and in the kind of hardware regime we are in. While different derivatives of these algorithms came about, a lot of applications were also looked at: real-world problems were mapped onto these algorithms. I'm going to comment on some of that later; some of it is listed here, such as quantum machine learning, option pricing, battery optimization, and so on.
But what about the hardware? It's important to understand the background to all of this. This chart shows the lifetime evolution of superconducting qubits. By lifetime, what is meant is this: if you encode some quantum property into a qubit, how long can it be sustained before noise starts dominating the system? On the x-axis is time, from about 2000 to 2020, and on the y-axis is lifetime in microseconds, with the scale starting from nanoseconds at the bottom. If you look at the Cooper-pair box around the year 2000, lifetimes were hovering at a few nanoseconds; the quantum property was available for only a few nanoseconds. That was not very interesting computationally, but it was an important step on the experimental side in proving certain properties. The specific implementations are not what matters here; what is important is the trend. The trend looks linear in this plot, but note that the y-axis is on a log scale, which means the lifetime has been increasing exponentially over the last 15-20 years, and more so in the last five to eight years, where progress has been much more rapid because the field has gotten more and more real. Notice that we are now hovering around 100 microseconds or so; I will show some actual data from the hardware shortly. This is important to understand because, while lifetime is increasing exponentially, it is still limited. 100 microseconds, for example, is the time we have to do all our quantum computation before noise takes over. That means the algorithm needs to be structured to be compact and shallow, so that we compute everything needed within that fixed time budget.
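To make that time budget concrete, here is a rough back-of-the-envelope sketch. The gate and readout durations below are illustrative assumptions, not figures from the lecture: with a ~100 microsecond coherence window and two-qubit gates taking a few hundred nanoseconds, only a few hundred sequential gate layers fit before noise dominates.

```python
# Rough depth-budget estimate for a NISQ device.
# The gate/readout durations are illustrative assumptions,
# not numbers quoted in the lecture.

def depth_budget(t_coherence_s, t_gate_s, t_readout_s):
    """Sequential gate layers that fit in one coherence window."""
    usable = t_coherence_s - t_readout_s   # reserve time for measurement
    return int(usable // t_gate_s)

layers = depth_budget(
    t_coherence_s=100e-6,  # ~100 microsecond qubit lifetime (from the talk)
    t_gate_s=300e-9,       # assumed two-qubit gate duration
    t_readout_s=5e-6,      # assumed readout duration
)
print(layers)  # a few hundred layers at most
```

Under these assumptions the budget is on the order of 300 gate layers, which is exactly why the talk keeps returning to "compact and shallow" circuits.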
Recently, Jay Gambetta put out a tweet indicating that we have now entered the regime of milliseconds, an order of magnitude up: we are moving from microsecond to millisecond lifetimes. We now have qubits that can live on the order of milliseconds and more. These are experimental at this point; hopefully this will soon be productized and available on the cloud. But clearly we are seeing a trend toward longer durations over which quantum properties can exist, which means we can do more and more computation on this hardware, and the trend seems to be exponentially increasing.
So then, why variational quantum algorithms? These are snapshots from actual hardware that we have on the cloud, machines a little on the higher end in terms of fidelity, as of 10th July 2021. If you go to the link where all the machines are listed, you can click on one and get a picture like this. I've chosen a subset: ibmq_mumbai is one machine, Kolkata is another, Montreal another. Notice that the Quantum Volume of these is 128, which at this point in time is on the higher side. The important part I want to highlight is in the orange box: the calibration data showing the average errors. The CNOT error hovers around 10^-3, roughly; the readout error is a little worse, at about 10^-2; and the average lifetime, in this case on Mumbai, is about 120 microseconds. So the CNOT error is roughly 0.1 percent and the readout error hovers around one percent. Kolkata is similar to the Mumbai machine: the average readout on Kolkata seems to be better, but the average CNOT error hovers in the same ballpark as Mumbai's. Montreal is a little older, and you can see that its CNOT error is worse, and its readout error is also worse compared to the other machines; its lifetime is also about 100 microseconds if you take the mean of the two lifetime numbers. So the current state of the art, as we see it, hovers around 100-120 microsecond lifetimes. The readout error, that is, the measurement error, dominates at around one percent; the CNOT error is about an order of magnitude less, but it is still significant, because as the number of CNOTs increases you accumulate more and more errors in the system, and finally the measurement (readout) adds a lot of noise as well. What this means is that the hardware puts constraints on the algorithms we come up with: we need shorter circuits, and we should still be able to solve a problem of consequence, meaning one that classical hardware finds hard. You want to solve, in a compact fashion, something that only a quantum system can solve; only then will we have something of a quantum advantage using these platforms.
So the variational algorithms fall into a nice form factor that fits these limitations of the current hardware. They are structured as hybrid algorithms, as we're going to see in detail shortly: you have a classical component that runs the optimization on classical hardware, while the key, harder element of the computation, for example calculating the energy value, or the expectation value of the lowest energy, is computed on the quantum hardware, and then the tuning of parameters happens classically. This is how the variational structure comes in, and it fits the current limitations of the hardware that is out there. That is the reason variational quantum algorithms have gained so much traction, particularly in the NISQ era. They are here to stay; more and more variations and derivatives of these algorithms are coming about, and many applications are being explored using these techniques. So it's important to understand what they are, how they work, the concepts behind them, the challenges therein, and the potential applications. That will be the topic of the next section.