36C3 Preroll music
Herald: So hello and welcome to a quantum
computing talk by Andreas [Dewes], who
gave a talk almost exactly five years ago,
it's like five years and two or three
hours ago. Back then he gave a talk
at 31C3 about quantum
computing titled "Let's Build a Quantum
Computer". And I think back then we
basically had just found out that Google
was planning to partner with the
University of California at Santa Barbara
to try to build a quantum computer. Of
course, now we're five years later, we've
had a lot of developments, I think, in the
field. We've had some big announcements by
Google and other groups. And Andreas has
now come back to give us an update. So
please welcome him to the stage.
Applause
Andreas: Okay. Hi, everyone, so I'm very
happy to be here again, five years after
giving the first version of this talk.
My motivation for giving this talk is quite
simple. I did my PhD on
experimental quantum computing from 2009 to
2012. I left that field afterwards to work
in industry, but always people would come
to me and ask, hey, Andreas, did you see
like this new experiment. Did you see, you
can like use quantum computers on Amazon's
cloud now? Did you see, like Google has
this new quantum thing? This is really
working? Can we use quantum computers yet?
Why are you not working on this? And I
couldn't really answer these questions.
So I said, OK, I want to go back
to this and find out what happened in the
last five years since I finished my PhD.
What kind of progress was made in the
field? And do we actually have quantum
computers today that are working already
or are we not quite there yet? So we
want to do it like this. I want to first
give you a short introduction to quantum
computing. So just that we have a common
understanding of how that works and why
it's interesting. Then I will show you a
small example of experimental quantum
speed up. Notably the work I did with my
colleagues in Saclay during my PhD
thesis. Then we discuss some of the
challenges and problems, why we were not
able to build a real quantum computer back
then. And I will discuss some approaches
that have come up since then, that would
basically allow us to do that eventually.
And then we'll, of course, discuss
Google's recent experiment in
collaboration with the University of Santa
Barbara, where they showed basically a
very impressive quantum computing system
with 53 Qubits. We will look at it closely
to try to understand what they did there
and see if that's really a quantum
computer in the real sense already
or if there's still something missing. And
in the end, of course, I will try to give
you another small outlook to see what we
can expect in the coming years. So in
order to talk about quantum computing, we
need to first talk about classical
computing just a little bit. You might
know that classical computers, they work
with bits, so zeros and ones. They store
them in so-called registers. This here, for
example, is like a bit register. Of course,
the bits themselves are not very
interesting, we have to do stuff with
them, so we can compute functions over
those bit registers. That's what a
modern CPU is doing, in a simplified way,
of course. So we take some input bit
register values, we compute some function
over them and then we get an output value.
So a very simple example would be a search
problem. I discuss this because
later we will also see in the experiment
how we can use a quantum computer to solve
this. So I just want to motivate why this
kind of problem can be interesting. And
it's a very silly search function. So it
takes two bits as inputs and it returns
one bit as an output, indicating whether
the input bits are the solution to our
search problem or not. And you could
imagine that we have a very, very
complicated function here. So, for
example, a function that calculates the
answer to life, the universe and
everything, well, not the complete answer,
but only the first two bits. So really
complicated to implement and very costly
to execute. So we might think that it
might take like millions of years to run
this function once on our inputs. And so
we want to find the right solution to that
function with as few function calls as
possible, of course. Overall, there are
four possible input states, 00, 01, 10
and 11, that we can apply our function to,
and only for one of these states, the 01
state, does the function return a 1,
because the answer is 42: that's 0 times 1
plus 1 times 2 plus some other stuff, so
the first two bits are 0 and 1. For this
value the function returns a 1, for all of
the other values it returns 0. Now let's
think about how we can implement such a
search function. And in principle, if we
don't know anything about the function. So
we can imagine it's so complicated that we
can't do any optimizations. We don't know
where to look. So we have to really try
each of these values in sequence. And for
this we can have a simple algorithm: we
start by initializing our bit register to
the 00 value. Then we call the function on
that register and see what the result is.
In this case, the result would be zero. If
the result were 1, we would know, okay, we
have found our solution, so we can stop
our algorithm. But in this case, the
result is zero, so we just go back to the
last step, increase the register value to
01 and try again. In the worst case,
depending on whether you're optimistic or
not, we have to do this three or four
times. So if we want to really be sure
that we find the right answer, we have to
do it four times in the worst case. And
this is, so to say, the time complexity or
the computational complexity of the
search. If you imagine that in our
algorithm the most expensive operation is
really calling this function F, then the
complexity of calling this function is
what dominates the complexity of our
algorithm. And in this case, the
complexity is very simple, because it's
linear in the size of the search space. So
if you have N states, for example, in our
example we have four different input
states, you also need to evaluate the
function four times. And please
keep this graph in mind because we're
gonna revisit that later a bit to see if
we can do better with a different paradigm
of computing. And so, classically, this is
really the best we can do for the search
problem here because we don't know
anything else about the function that
would allow us to optimize that further.
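To make this concrete, here is a minimal Python sketch of the brute-force search just described. The toy oracle f and the solution 01 are taken from the example; f just stands in for the expensive black-box function.

```python
# A minimal sketch of the classical brute-force search described above.
# The oracle f stands in for the expensive black-box function; by
# assumption from the example, only the input 01 is the solution.
def f(x):
    return 1 if x == 0b01 else 0

def linear_search(n_bits=2):
    calls = 0
    for x in range(2 ** n_bits):   # try 00, 01, 10, 11 in sequence
        calls += 1
        if f(x) == 1:              # found the solution, stop
            return x, calls
    return None, calls             # worst case: 2**n_bits calls

solution, calls = linear_search()
print(f"solution = {solution:02b}, oracle calls = {calls}")
```

Here the solution happens to be found on the second call; in the worst case all four inputs have to be tried, which is exactly the linear complexity discussed above.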
But now the interesting thing is that we
might imagine that we don't use classical
computing for solving our problem. And in
fact, the discipline that we call quantum
computing was kind of inspired by a
lecture, or a seminar, by Richard
Feynman, who thought about how, or
whether, it would be possible to simulate
quantum systems on a classical computer.
A Turing machine, if you want. And he
found that quantum mechanics is so
complicated for classical computers that
it is not possible to do that efficiently,
but that if you used the laws of quantum
mechanics themselves to build a computer,
a quantum computer, then it would be
possible to simulate these quantum
systems. And this kind of sparked the
whole idea of using quantum mechanics to
do computation. And in the following
years,
solutions were found not only for
simulating quantum systems with such a
quantum computer, but also for solving
other problems not related to quantum
mechanics, like search problems or
factorization problems, for example. And
quantum computers can do
computation faster than classical
computers, because they have several
differences in how they work. So one of
the key differences here is superposition,
which means that if you use a quantum
computer, instead of a classical computer,
we can not only load a single register
value into our bit register, so for
example the first value of only zeros,
but instead we can load all of the
possible state values at once, or in
parallel. This gives a so-called quantum
state, or quantum superposition state,
where each of these values here has an
amplitude, which is shown on the left,
that is basically a complex number
relating it to the other states. If you
have, for example, n Qubits, then the
total number of basis states can be very
large: 2 to the power of n. So you can
imagine that if you have a large quantum
bit register, then your number of quantum
states can be really, really large and
this can be very powerful for computation.
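As a rough illustration of that linear-algebra picture (not how a real quantum computer is programmed, just the bookkeeping): an n-Qubit register corresponds to a vector of 2**n complex amplitudes, and a uniform superposition puts equal weight on every basis state.

```python
import numpy as np

# The state of an n-Qubit register is a vector of 2**n complex amplitudes.
n = 3
dim = 2 ** n                                           # 8 basis states for 3 Qubits
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # uniform superposition
print(dim)                                             # 8
print(np.sum(np.abs(state) ** 2))                      # squared magnitudes sum to 1
```

The last line is the normalization condition mentioned in the talk: the probabilities of all states together must add up to 100 percent.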
So in the rest of the talk, we're gonna
just indicate this by showing the register
as a small rectangle, to indicate that
it's not only a single value in there, but
that we have a superposition of all the
possible input values to our function, for
example. And there is a so-called
normalization condition that puts some
constraints on these amplitudes, because
the sum of the squares of the absolute
values of these amplitudes needs to be
one, which basically means that the
probability of all of these states
together needs to be 100 percent. And this is
the first ingredient that makes quantum
computers interesting for computation
because we can basically implement any
classical function that we can also run on
a classical computer, on a quantum
computer. The difference is that we can
not only run it for one value at a time,
but we can run it on a superposition of
all possible input values. So if you want,
you have this massive parallelization,
where you run your computation on all
possible inputs at once and also calculate
all of the possible output values. And
that sounds, of course, very cool and very
useful. There's a catch that we will
discuss later, so it's not as easy as
that. But this is one part of the power
that makes quantum computing interesting.
The
next thing that is different is that, on a
quantum computer, we can not only run
classical gates or classical functions,
but also so-called quantum gates. And the
quantum gates are different from the
classical functions, because they can not
only perform classical operations, like
AND or OR, on two Qubits in a predictable
way, but they can act on the whole Qubit
state at once and also create so-called
entangled states, which are really weird
quantum states where we can't really
separate the state of one Qubit from the
state of the other Qubits. So it's kind of
like, if we try to make a small change to
one of the Qubits in our system, we also
change the other Qubits. We can never
separate the Qubits out like we can with a
classical computer. And this is another
resource that we can use in quantum
computing to solve certain problems faster
than we could with a classical computer.
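In the same linear-algebra picture, here is a sketch of how gates acting on the whole state create such an entangled state. This uses a Hadamard followed by a CNOT, the standard textbook sequence, not the specific gates from the talk:

```python
import numpy as np

# Hadamard on the first Qubit, then a CNOT between the two Qubits.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = CNOT @ np.kron(H, I2) @ state          # entangling gate sequence
print(state.real.round(3))  # [0.707 0.    0.    0.707]: (|00> + |11>)/sqrt(2)
```

The result has weight only on |00> and |11>, so the two Qubits can no longer be described separately: measuring one immediately fixes the other, which is exactly the inseparability described above.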
Now, the catch, as I said, is that we of
course do not only want to make
computations with our Qubit register, but
we also want to read out the result of our
computation. And if we try that, so we
make some computation and then want to
measure the state of our quantum register,
we have a small problem, because, well,
the measurement process is actually quite
complicated. But in a very simplified way,
you can just imagine that God is throwing
some dice here. If we have a quantum state
vector that has these amplitudes on the
left, so a one to a n, then he or she
would pick a state randomly from the
possible states. And the probability of
getting a given state as a result is
proportional, as I said before, to the
square of the absolute value of the
amplitude. So that means we can perform
computation on all of the possible input
states of our function. But when we read
out the result, we will only get one of
the possible results. So at first glance
this kind of destroys the utility of
quantum computing, because we can do
computation on all states in parallel, but
we cannot read out the full result. So not
a very interesting computer, because we
can't learn about the output, or not
easily at least. But it
turns out that there's actually a way of
still using quantum computing to be faster
than a classical computer. And the first
kind of practical algorithm for a search
problem, notably the search problem that
we discussed before, was given by Lov
Grover, who was a researcher at Bell Labs
and who found the Grover algorithm that is
named after him. That's basically a search
algorithm which, as we will see, can
provably solve the search problem that we
have in a much more efficient way than any
classical computer could. And in my
opinion, it's still one of the most
beautiful quantum algorithms, because it's
very simple and it's very powerful. And it
is also proven, unlike for other
algorithms like the factorization
algorithm from Shor, that the Grover
algorithm will always be faster than any
classical algorithm. So in my opinion,
it's a very nice example of a quantum
algorithm that is really more powerful
than a classical one. Let's see how it
works. So there are three steps in the
algorithm. First
we initialize our Qubit register, our
state vector, to a superposition of the
four possible input values, so 00, 01, 10
and 11, all with equal amplitude in this
case. Then we evaluate the function on
this input state. We use a special
encoding here that basically marks the
solution of our problem by changing the
sign of the amplitude of the corresponding
state. You can see that in the output
state here the 01 state has a negative
sign, which means that it's the solution
of the problem that we search for.
Still, if we were to read out now
directly, we wouldn't be able to learn
anything about the solution, because, as
you can see, the absolute value of the
amplitude is still equal for all of the
four states. So if we would make a readout
now, we would only get one of the four
possible states at random, so we wouldn't
learn anything about the solution of our
problem. In order to
do that, we need to apply another step,
the so-called Grover diffusion operator,
which now takes this phase difference, or
sign difference, between the individual
quantum states and applies a quantum
operation that basically transfers the
amplitude from all of the states that are
not a solution to the problem to the state
that is the solution. In this case, with
two Qubits and four possible values,
there's only one step we need. And after
executing it, you can see that the
amplitude of our solution state is now
one, whereas the amplitude of the other
states is all zero. So that's great,
because now we can just do a Qubit
measurement and then we will with a
hundred percent probability find the
solution to our search problem. And that's
where the magic of quantum mechanics kind
of shows, because we evaluate the function
only once. So remember that in the first
step we only called the search function
once, on all of the values in parallel. So
in terms of computational complexity we
are much lower than the classical
algorithm, but still we are able to find,
with 100 percent certainty in this case,
which state is the solution to our search
problem. And that's working not only in
the case of two Qubits, but also with
larger Qubit registers. For example, if
you would take 10 Qubits, you would need
to execute a few more of these steps two
and three: instead of one iteration, you
would need 25 iterations, for example,
which is still much better than the 1024
iterations that you would need if you
would really look at every possible input
of the function with the classical
algorithm. So the speed up here is
quadratic in the size of the solution
space. And if you look at
the complexity plot again, we can now
compare our classical algorithm with the
quantum algorithm, the Grover search. As
you can see, the time complexity, or the
number of evaluations of F that we need,
is only the square root of N, where N is
the size of the search space, which shows
that we really have a speed advantage of
the quantum computer versus the classical
computer.
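The two-Qubit Grover run described above can be reproduced numerically; here is a sketch with the solution state 01 from the example, with the oracle and the diffusion operator written as plain matrices:

```python
import numpy as np

N = 4                                    # search space: 2 Qubits, 4 states
state = np.full(N, 0.5)                  # step 1: uniform superposition
oracle = np.diag([1, -1, 1, 1])          # step 2: flip the sign of |01>
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)  # step 3: inversion about the mean
state = diffusion @ (oracle @ state)     # one Grover iteration
print(np.abs(state) ** 2)                # [0. 1. 0. 0.]: measurement yields 01
```

For larger registers, the same oracle-plus-diffusion step is simply repeated about pi/4 times the square root of N times, which is where the square-root speed up in the plot comes from.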
And the nice thing is, the larger our
search space becomes, the more dramatic
our speed up will be, because, for
example, for a search space with one
million elements, we will only have to
evaluate the search function 1000 times
instead of one million times. So that's
quite a speed up in that sense. Now, how can
we build a system that realizes this
quantum algorithm? Here I show you the
quantum processor that I built with my
colleagues at Saclay during my PhD. If you
want more information about this, you
should check out my last talk, I just want
to go briefly over the different aspects
here. So we used so-called superconducting
Qubits, transmon Qubits, for realizing our
quantum processor. You can see the chip
here on the top, it's about one centimeter
across, and the two Qubits in the middle.
The other, snake-like structures are
coupling waveguides, where we can
manipulate the Qubits using microwaves. So
we use frequencies that are similar to the
ones used by mobile phones to manipulate
and read out our Qubits. And if you look
in the
middle, you can see the red area, which
contains the Qubit itself. And then
there's another zoom-in here, which shows
the actual Qubit structure, which is just
two layers of aluminum that have been
placed on top of each other and which
create, when they are cooled to a very low
temperature, a so-called superconducting
state, where we can use the
superconducting phase between these two
layers to realize our Qubits. There's also
a coupler in the middle, this green
element that you see, which allows us to
run quantum gate operations between the
two Qubits. To use
that in practice, we need to put this in a
dilution cryostat, which is really just a
very fancy refrigerator, you could say. We
cool it down to about 10 millikelvin, so a
very low temperature, just above absolute
zero. You can see the sample holder here
on the left side with the chip mounted to
it. This whole thing is put into the
dilution fridge, cooled down to that
temperature, and then we can, as I said,
manipulate it using these microwave
transmission lines.
And what we did is we implemented the
Grover search for the two Qubits. So we
ran this algorithm that I discussed
before. I don't want to go too much
into the details. The results are obtained
by running this algorithm many times. And
as you can see, we have achieved not a
hundred percent success probability, but
over 50 percent in most cases, which
is like, yeah, not perfect, of course, but
it's good enough to, in our case, show
that there was really a quantum speedup
possible. And if you ask why, okay, why is
not 100 percent probability possible or
why can't we build larger systems with
data, what kept us from, for example,
building a 100 or 1000 qubit quantum
processor? Well, there's several things on
this, of course, that we have like we make
errors when we manipulate the Qubits. So
the microwave signals are not perfect, for
example. So we introduce small errors when
like making two Qubit and single Qubit
interactions. We also need a really high
degree of connectivity if we want to build
a large scale quantum computer. So if
every Qubit is connected to every other
Qubit, for example, that would make one
million connections for 1000 Qubit code
processors, which processor which is just
on the engineering side, very hard to
realize. And then also our Qubits has
errors because they can the environment
that the Qubits are in, like the chip and
the vicinity there also introduces noise
that will destroy our quantum state and
that limits how many operations we can
perform on a single Qubit. So is possible
solution, which is the surface code
architecture, which was actually
introduced already in 2009 by David DiVincenzo from
the Jülich Research Center. And the idea
here is that we do not have a quantum
processor with full connectivity. So we do
not connect every Qubit to every other
Qubit; instead, we only connect a Qubit to
its four neighbors via so-called tunable
couplers. And this is, of course, much
easier because you don't need so many
connections on a chip. But it turns out
that you can still run most of the quantum
algorithms that you could also run with a
fully connected processor. You just have
to pay like a penalty for the limited
connectivity. And the nice thing is also
that you can encode a single logical
Qubit, so a Qubit that we want to do
calculations with, as, for example, five
physical Qubits. All of these Qubits here
that are on the chip would together form
one logical Qubit, which would then allow
us to do error correction: if there has
been some error on one of the Qubits, for
example a relaxation or a dephasing error,
then we can use the other Qubits that we
prepared in exactly the same way to
correct this error and continue doing the
calculation. And this is quite important,
because in these superconducting Qubit
systems there are always errors present,
and we will probably not be able to
eliminate all of them. So we need to find
a way to correct the errors while we
perform the computation. Now the Google
processor
follows the surface code approach. Here I
show you an image from the Nature article
which was released, I think, one month
ago. It's a very impressive system, I
find: it contains 53 superconducting
Qubits and 86 tunable couplers between
those Qubits, and they achieve fidelities,
so success probabilities, if you like, for
performing one- and two-Qubit gates, which
are higher than 99 percent. This is
already very, very good, and almost enough
fidelity to realize quantum error
correction, as I discussed before. And
with this system you can really run quite
complex quantum algorithms, much more
complex than the ones that we ran in 2012.
In the paper, for example, they run
sequences with 10 to 20 individual quantum
operations, or quantum gates. And just to
give you an
impression of the quality of the cryogenic
engineering and microwave engineering
here: this is, so to say, the dilution
cryostat where the Qubit chip is mounted,
and you can see that it's quite a bit more
complex than the system we had in 2012. So
it really looks way more like a
professional quantum computer, I would
say. If you ask a physicist now, why would
you build such a system? The answer would
be, of course: well, it's awesome, so why
not? But it turns out that if an
organization like Google gives like 100 or
200 million US dollars for realizing such
research, they also want to see some
results. So that's why the team under John
Martinis tried to use this quantum
processor for something that shows how
powerful it is, or that it can, so to say,
outperform a classical computer. And this
sounds easy, but actually it's not so easy
to find a problem that is both doable on
this quantum computer, which has a bit
more than 50 Qubits and like 80 couplers,
but is not possible to simulate on a
classical computer. So we could think, for
example,
about the factoring of numbers into prime
components, which is, of course, always
like the motivation of certain agencies to
push for quantum computing, because that
would allow them to read everyone's email.
But unfortunately, both the number of
Qubits that this would require and the
number of operations are much too high to
realize something like this
on this processor. The next thing which
would be very interesting is the
simulation of quantum systems. So if you
have molecules or other quantum systems
that have many degrees of freedom, it's
very difficult to simulate those on
classical computers; on a quantum computer
you could do it efficiently. But again,
the Google team did not do this. I assume
they just didn't have a feasible problem
where they could actually perform such a
simulation that would not also be
calculable on a classical computer. But in
the near term, in the future, this might
actually be a very relevant application of
such a processor. The last possibility
would be
to run, for example, the search algorithm
that we discussed before. But again, for
the number of Qubits that are in the
system and the size of the search space,
it's still not possible, because the
algorithm requires too many steps, and the
limited coherence times of the Qubits in
this processor make it impossible to run
this kind of algorithm there, at least to
my knowledge. So what they did then was to
perform a different kind of experiment,
one that was doable with the processor,
which is so-called random circuit
sampling. And in
this case, what you do is, instead of
running an algorithm that does something
actually useful, like a search algorithm,
you run just a random sequence of gates.
So you have, for example, your 53 Qubits
and then you first run some single-Qubit
gates, so you change the Qubit values
individually. Then you run two-Qubit gates
between random Qubits to create a
superposition and an entangled state. And
in the end, you just read out the
resulting Qubit state from your register.
This is also a very complex operation, so
you really need a very high degree of
control of your quantum processor, which
the Martinis team, the Google team, was
able to achieve here.
It's just not solving a really practical
problem yet, so to say. But on the other
hand, it's an algorithm that can be run on
the quantum computer easily, but which is,
as we will see, very difficult to simulate
or reproduce on a classical system. And
the reason that it's so
difficult to reproduce on a classical
system is that, if you want to simulate
the action of these quantum gates that we
run on the quantum computer using a
classical machine, a classical computer,
then for every Qubit that we add, the size
of our problem space roughly doubles.
So you can imagine, if you have like two
Qubits, then it's very easy to simulate
that, you can do it on your iPhone or your
computer, for example. If you add more and
more Qubits, you can see that the problem
size becomes really big really fast. So if
you have like 20 Qubits, 30 Qubits, for
example, you cannot do it on a personal
computer anymore, you will need a
supercomputer.
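A back-of-the-envelope sketch of why that is: storing the full state vector for n Qubits takes 2**n complex amplitudes, at 16 bytes each (two 64-bit floats).

```python
# Memory needed to hold a full n-Qubit state vector on a classical machine,
# assuming 16 bytes per complex amplitude (two 64-bit floats).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (2, 20, 30, 53):
    print(f"{n:2d} qubits: {statevector_bytes(n) / 1e9:.3g} GB")
```

For 53 Qubits this comes out to roughly 144 petabytes, far beyond the memory of any current supercomputer, which is the regime the Google circuits land in.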
And then if you keep increasing the number
of Qubits, then at some point in this
case, 50 Qubits or 53 Qubits, it would be
impossible even for the fastest
supercomputers that we have right now. And
that's what is called the quantum
supremacy regime for these randomized gate
sequences, which is basically just the
area here on the curve, that you see, that
is still doable for this quantum processor
that Google realized, but is not
simulatable or verifiable by any classical
computer, even a supercomputer, in a
reasonable amount of time. And if we can
run something in this regime, it proves
that we have a quantum system that is able
to do a computation which is not
classically reproducible. So it's
something that really can only be done on
a quantum computer. And that's why running
this kind of experiment is interesting,
because it really shows us that quantum
computers can do things that classical
computers cannot do, even if they are for
the moment not really useful. And the gate
sequence that they run looks something
like this. We can see here a few of the
Qubits that the Google team has. They run
sequences of operations of different
lengths, then perform a measurement and
then just sample the output of their
measurements. So what they get as a result
is a sequence of long bit strings, so
zeros and ones, for each experiment they
run. And to check that the quantum
computer is actually doing the right
thing, you have to compare it to the
results of a classical simulation of this
algorithm.
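The figure of merit the Google paper uses for this comparison is the linear cross-entropy benchmarking (XEB) fidelity: you look up the ideal, simulated probability of each bitstring the device actually produced and compute F = 2**n * mean(P(sample)) - 1. A toy sketch; the distribution here is made up for illustration:

```python
import numpy as np

# Linear XEB fidelity: F = 2**n * <P(x_i)> - 1, where P(x_i) is the
# ideal (classically simulated) probability of each sampled bitstring x_i.
def linear_xeb(ideal_probs, samples, n_qubits):
    return (2 ** n_qubits) * np.mean([ideal_probs[x] for x in samples]) - 1

# Toy 2-Qubit example: uniformly random samples carry no quantum
# signature and score ~0, while samples that favor the high-probability
# bitstrings of the ideal distribution score positive.
probs = np.array([0.4, 0.3, 0.2, 0.1])
print(linear_xeb(probs, [0, 1, 2, 3], 2))  # uniform samples -> ~0
print(linear_xeb(probs, [0, 0, 1, 2], 2))  # ideal-ish samples -> positive
```

A noiseless device sampling from the ideal distribution scores close to 1; the more noise, the closer the score drops toward 0, which is exactly what makes it usable as an overall fidelity estimate.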
And that's, of course, a problem now,
because we just said that we realized a
quantum computer, a quantum processor,
which is able to do this computation on 53
Qubits and that no classical computer can
verify that. So the question is now, how
can they prove or show that what the
quantum computer calculates is actually
the correct answer and that it does not
just produce some garbage values? And
that's a very interesting question,
actually. And the way they did it here is
by extrapolation. So instead of, for
example, simulating the full circuit, that
contains all of the connections and all of
the gates of the full algorithm, they
created simplified circuits in two
different ways. For example, they cut some
of the connections between the Qubits in
the algorithm, so that the problem space
would become a bit smaller, or, in the
other case, with the so-called elided
circuits, they changed the operations in
order to allow for some shortcuts in the
classical simulation of the algorithm. So
in both cases, they were able to verify
the result of the quantum computation with
a classical simulation performed on a
supercomputer. And then they basically
just did this for a larger and larger
number of Qubits, plotted the resulting
curve and extrapolated that into the
supremacy regime, to see that, OK, based
on the error models that they developed
from the simulations, they can, with a
certain confidence of course, say that the
quantum computer is probably doing the
right thing even in the supremacy regime,
even though they cannot verify it there
using classical simulations. And in case
the quantum computer did get it wrong
after all, they have also archived the
results, and maybe in ten years, when we
have better supercomputers, we might be
able to just go back and verify them
against this 53 Qubit processor, by which
time, of course, they might already have a
larger quantum processor again. So the key
results of this, I would say, are that for
the first time they showed that a quantum
computer can really beat a classical
computer, even though it is at a very
artificial and probably not very useful
problem. And what the experiment also
shows is really, I would say, an
astounding level of control of such a
large scale, or medium size, quantum
processor, because even five, six years
ago, 2012, 2013, the systems that we
worked with mostly consisted of three or
four Qubits, and we could barely fabricate
the chips and manipulate them to get
algorithms running. And if I now see a 53
Qubit processor with such a high degree of
control and fidelity, I can really say
that is amazing progress in the last five
years, what was achieved, especially by
the Google Martinis team here. And I think
it is a very good milestone on the way to
a fully working quantum computer, because
it nicely shows the limitations of the
current systems and gives a good direction
for new areas of research, for example in
error correction, where we can improve the
different aspects of the quantum
processor. The research has also been
criticized from various sides, so I just
want to reiterate a few of the points
here. One of the criticisms is, of course,
that it doesn't do anything useful, so
there's really no applicability of this
experiment. And while that's true, it's of
course very difficult to go from a basic,
very simple quantum processor of two
Qubits to a system that can really
factorize numbers or do anything useful.
So we will always need to find problems
that are hard enough, but that we can
still solve in a reasonable timeframe, a
couple of years for example, and that
prove the progress we make on the road to
quantum computing. So in this sense, while
quantum supremacy does not really show
anything useful in terms of the
computation that is done, I think it is
still a very good problem as a benchmark
for any kind of quantum processor, because
it requires that you have very good
control over your system and that you can
run such a number of gates at a very high
fidelity, which is really, currently I
would say, the state of the art. The
research also took
some shortcuts. For example, they used two-qubit quantum gates which are not what we call canonical gates, which might be problematic, because if you want to run a quantum algorithm on the system, you need to implement certain quantum gates. And since they only have non-canonical gates here, which are still universal, by the way, they could not do that directly, but with some modification of the system that should also be possible. And the last criticism
might be that, okay, here you have a problem that was engineered to match a solution. And that is of course true: we needed, as I said, some problem that we can realistically solve on such a system. I think, though, as with the other points, if you want to build a large-scale quantum processor, you need to define reasonable milestones, and having such a benchmark that other groups, for example, can also run their processors against is a very good thing, because it makes the progress visible and also makes it easy to compare how different groups, companies, or organizations are competing on the number of qubits they have under control. So, if you wanted something like a Moore's law for quantum computing, there would be several possibilities. Here I show you, for
example, the number of Qubits that have
been realized for superconducting systems
over the years. This is of course incomplete, because the number of qubits alone doesn't tell you much about your system. I mean, we could do a qubit chip of 1000 or 10000 qubits today, but if you don't have the connectivity and don't have the controllability of individual qubits, then this chip wouldn't be any good. So there are other things that we also need to take into account here, such as the coupling between individual qubits, the coherence time, and the fidelity of the qubit operations. So this is really just one very small aspect of this whole problem space. But I think it shows nicely that in the last years there was really tremendous progress in terms of the power of superconducting systems,
because the original qubit, which was developed at NEC in Japan by Professor Nakamura around 2000, had very, very bad coherence times and very bad properties. But still, it showed for the first time that one could coherently control such a system. And then it didn't take long for other groups, for example the Quantronics group at Saclay, to pick up on this work and keep improving it. So after a few years we already had qubits with a few hundred nanoseconds or even a microsecond of coherence time, which was like three orders of magnitude better than what we had before. And there were other advances then made by groups in the US, for example the Schoelkopf lab at Yale, which developed new qubit architectures that allowed us to couple the qubits more efficiently with each other and, again, to have better control when manipulating
them. And then there are also groups like the research group at IBM, or companies like Rigetti, that took these ideas and added engineering and their own research on top in order to make the systems even better. So in 2018 we already had systems with 17 or 18 qubits, and now, with this Google and UC Santa Barbara work, we have the first systems with more than 50 qubits after not even 20 years, which I think is quite some progress in this area. And of
course, if you ask me how close we are to an actually working quantum computer, it's still very difficult to say, I find, because the group has proven quantum supremacy for this randomized algorithm, but in order to do something applicable or useful with such a quantum system, I think we need at least another 50, maybe 200, additional qubits and a larger number of qubit operations. But it's really hard to say. That's why I also say don't believe in this chart, because there's also, of course, a lot of work in
the theory of quantum algorithms, because up to now we are still discovering new approaches to doing quantum simulations, for example. And right now there are a lot of research groups that are looking for ways to make these medium-scale quantum computers, so quantum computers with 50 or 100 qubits, already useful for quantum simulations. So it's really an interplay between what the theory can give us in terms of quantum algorithms and what we can build in terms of experimental realization of a quantum processor. So in
my opinion, quantum simulation will definitely be something where we will see the first applications in the next, I would say, three to five years. For other things, like optimization, I have to admit I am less of an expert, and I think they're a bit more complex, so we will probably see the first applications in those areas a bit later. And the big motivation for
the three-letter agencies, of course, is always factoring, the breaking of cryptosystems, which is the most challenging one, though, because in order to do that you would need both very large numbers of qubits, so at least 8000 qubits for an 8000-bit RSA key, for example, and a very large number of qubit operations, because you need to run the Shor algorithm, and that involves a lot of steps for the quantum processor. So this is, I would say, from my perspective, the most unrealistic application of superconducting quantum processors in the next years. But I think, if somebody did build such a quantum computer, maybe we would just not know about it. So who knows? So to
summarize, quantum computers, quantum processors are getting really, seriously complex and very impressive. So we have seen tremendous progress in the last five years. I still think that we are about five years away from building really practical quantum computers, and there are still some challenges, for example in error correction, in the quantum gate fidelity, and indeed in the general architecture of these systems, that we need to overcome. And there might also be some challenges which we haven't even identified yet and which we might only encounter at a later stage when trying to build really large-scale quantum processors. And as a last
point, I just want to stress again that quantum computing research is not only done by Google or by IBM; there are a lot of groups in the world involved in this kind of research, both in theory and in experiment. And as I said before, a lot of the breakthroughs that we use today for building quantum processors were made in very different places, like Japan, Europe, the USA. So it's really, I would say, a global effort. And when you see the marketing PR that companies like Google and IBM do, maybe don't believe all of the hype they're creating and keep a down-to-earth view, so to say, of the limits and the potential of quantum computing. So that's it. And I
would be happy to take your questions now. If you have any feedback, there's also my Twitter handle and my email address. And I think we also have some time for questions here right now. Thank you.
Applause
Herald: Thank you, Andreas. We have almost
20 minutes for Q and A. If you're leaving
now, please do so very quietly and if you
can avoid it, just don't do it. Thank you.
Okay. Q and A. You know the game. There's
eight microphones in this room, so just
queue behind them and we will do our best
to get everyone sorted out sequentially.
We will start with a question
from the Internet.
Signal-Angel: Thank you. Do you have
information about the energy consumption
of a quantum computer
over the calculation power?
Andreas: Yeah, that's an interesting
point. I mean, for superconducting quantum
computers, there are like several costs
associated. I think right now the biggest
cost is probably that of keeping the system cooled down, since you need very low temperatures, 10 or 20 millikelvin. In order to achieve that, you need so-called dilution refrigerators, and these systems consume a lot of energy and also materials like helium mixtures, which are expensive and maybe not so readily available right now. I think that would be the biggest consumption in terms of energy use. Beyond that, I honestly don't have much of an idea. I mean, the manipulation of the qubit system is done via microwaves, and the power that goes into the system is very small compared to the power that we use for cooling the system. So I would say for the foreseeable future the power consumption should be dominated by the cooling and the setup costs, and the cost of the electronics as well, that is, the classical electronics that control the qubits, which can also be quite extensive for a large system. The qubit chip itself should really be negligible in terms of energy consumption.
Herald: Thank you. Microphone number one
please.
Mic 1: Hello. I have a question in regards
to quantum simulation. So I would have
thought that with 53 Qubits,
there would already be something
interesting to do, since I think the limit for more or less exact quantum chemistry calculations on classical computers is around 10 to 20 particles. So is there a more
complicated relation from particles to
Qubits that's missing here or what's the
problem?
Andreas: Yeah. So in the paper I couldn't find an exact reason why they chose this problem. I think there are probably two aspects. One is that you don't have arbitrary qubit control in the system, so to say you cannot run any Hamiltonian or quantum algorithm that you want; you are limited in terms of connectivity. So it's possible that they were not able to run any quantum algorithm for simulation that wasn't also easy to run on a classical system. But I'm really not sure why they didn't. I think if they had had the chance to do a quantum simulation, they would probably have done that instead, because that's, of course, more impressive than randomized algorithms. So because they didn't do it, I think it was just probably too complicated or not possible to realize on the system. But again, I don't know for sure.
Thank you.
Herald: Yes, and also, speaking as a sometimes quantum chemist, you can't directly map qubits to atoms. They're not two-level systems. And you usually also simulate electrons and not just atoms, but I'm not the speaker; we can discuss later. Maybe microphone number two,
please.
Mic 2: Thanks. Can you compare this classical, general quantum computer to the one by D-Wave? That's one of the quantum computers offered by AWS. They have two thousand qubits or something.
Andreas: Yeah, that's a very interesting question. So the D-Wave system is a so-called adiabatic quantum computer, to my knowledge, and in this case the computation works a bit differently. With the normal, gate-based quantum computer that Google produced, you have a gate sequence that you run on your input qubits and then you get a result that you read out. With the D-Wave system, it's more that you engineer a Hamiltonian, which also consists of local interactions between different qubits, and then you slowly change this Hamiltonian in order to drive the ground state of the system to a solution of the problem that you're looking for. So it's a different approach to quantum computation. They also claimed that they can achieve quantum supremacy, I think in a different way, for an optimization problem. But to my knowledge, the proof they have is probably less rigorous than what the Google group produced here. But again, I'm not an expert on that, adiabatic quantum computing; I'm more of a gate-based person. So, yeah, I think, though, that the proof Google shows here is more convincing in terms of reproducibility and of really proving that you are actually doing something that cannot be done on a classical computer. D-Wave will probably see that differently, though.
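[Editorial note: the adiabatic idea described here, slowly deforming a driver Hamiltonian into a problem Hamiltonian so that the ground state ends up encoding the answer, can be sketched numerically. This is an illustration with a made-up two-qubit Ising instance, not D-Wave's actual hardware model:]

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def two_qubit(op1, op2):
    return np.kron(op1, op2)

# Driver Hamiltonian H0: transverse field; its ground state is the
# equal superposition of all basis states
H0 = -(two_qubit(sx, I2) + two_qubit(I2, sx))

# Problem Hamiltonian H1: a tiny Ising instance whose ground state
# (here the basis state |10>) encodes the answer we are looking for
H1 = two_qubit(sz, sz) + 0.5 * two_qubit(sz, I2)

def ground_state(H):
    vals, vecs = np.linalg.eigh(H)
    return vals[0], vecs[:, 0]

# Sweep the interpolation H(s) = (1-s)*H0 + s*H1 and watch the
# ground state move from the driver's to the problem's
for s in (0.0, 0.5, 1.0):
    energy, state = ground_state((1 - s) * H0 + s * H1)
    probs = np.round(state ** 2, 3)
    print(f"s={s:.1f}  E0={energy:+.3f}  weights on |00>,|01>,|10>,|11>: {probs}")
```

At s=0 the ground state is the uniform superposition favored by the driver; at s=1 all the weight sits on |10>, the minimum of the toy Ising problem. A real annealer does this sweep physically rather than by diagonalization.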
Herald: All right. Let's go to the back.
Number seven, please. Hello. 7. You just
waved to me.
Mic 7: Hey, hello. I was reading that earlier this year IBM released the first commercial Q System One, or whatever the name is. And you were mentioning
before to keep our expectations down to
Earth. So my question is, what kind of
commercial expectations is IBM actually
creating?
Andreas: Mm hmm. So I spoke to some companies here in Germany that are collaborating with IBM or D-Wave or Google, to ask what they're actually doing with the quantum computers these companies offer. And I think the answer is that right now a lot of companies are investigating this as something that could potentially be very useful or very relevant in five to ten years. So they want to get some experience and they want to start collaborating. I don't think, or at least I don't know of, any production use of these systems where the quantum computer would do some calculation that would not be doable on a classical system. But again, I don't have a full overview of that. I think right now it's mostly for experimentation and for getting to know these systems. I think most of the customers there probably expect that in five or ten years the systems will really be powerful enough to do some useful computations as well.
Herald: Thanks. All right. The Internet,
please.
Signal-Angel: With a quantum computer, you can calculate things in parallel. But there is this reversibility requirement. So how much faster is a quantum computer at the end of the day?
Andreas: Mm hmm. Yeah, it's true that if you want to realize a classical algorithm on a quantum computer, you have to do it in a reversible way. But to my knowledge, you can, from an efficiency perspective, implement any classical non-reversible algorithm as a reversible algorithm without loss in complexity. Also, for reversible computation you have universal gates, like the Toffoli gate, that you can use to express any logic function that you require. You might need some additional qubits compared to the number of classical bits that you need for the computation, but in principle there is nothing that keeps you from implementing any classical function on a quantum computer. In terms of actual runtime, of course, it depends on how fast you can run individual operations. Right now a single-qubit operation, for example on this Google machine, takes about, I think, 20 to 40 nanoseconds. So in that sense quantum computers are probably much slower than classical computers. But the idea is anyway that you do only the really necessary computations, the ones you can't do on a classical machine, on the quantum computer, and anything else you do on a normal classical system. So the quantum processor in this sense is only a kind of co-processor, like a GPU, I would say.
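[Editorial note: the reversibility point can be made concrete with the Toffoli (controlled-controlled-NOT) gate, which is universal for reversible classical logic. A purely classical sketch, with helper names that are ours, not from any quantum library:]

```python
# The Toffoli (CCNOT) gate flips its target bit only when both control
# bits are 1. Applied to (a, b, 0) it computes (a, b, a AND b), so an
# irreversible AND becomes reversible at the cost of one extra
# (ancilla) bit, matching the "some additional qubits" remark above.

def toffoli(a, b, target):
    return a, b, target ^ (a & b)

# AND via Toffoli, with the ancilla initialized to 0
def reversible_and(a, b):
    _, _, result = toffoli(a, b, 0)
    return result

# The gate is its own inverse: applying it twice restores any input,
# which is exactly what reversibility means
for bits in [(a, b, t) for a in (0, 1) for b in (0, 1) for t in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

print([reversible_and(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
```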
Herald: All right. Microphone number four,
please.
Mic 4: On the slide that shows Richard
Feynman, you said that quantum computers
were invented to simulate quantum systems.
And can you please elaborate on that?
Herald: You went past, huh?
Andreas: Yeah. So I don't have the link to the lecture here, unfortunately, the link is broken, but you can find it online. It's a 1982 lecture by Feynman where he discusses how you would actually go about simulating a quantum system, because, as I have shown, if you want to simulate a full quantum system, you need to simulate the density matrix of the system, and that takes an exponential amount of memory and computation in terms of the number of qubits or quantum degrees of freedom that you want to simulate. With a classical Turing machine you couldn't do that in an efficient way, because every time you add a single qubit, you basically quadruple your computational requirement. And that's really where the idea came from, I think, for Feynman to think about a computing system that would use quantum mechanics in order to be able to do these kinds of simulations, because he probably saw that for large quantum systems it would never be possible, at least with our current understanding of classical computing, to run a quantum simulation of a quantum system on a classical computer in an efficient way. Does that answer the question?
Mic 4: Yeah.
Andreas: Okay.
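[Editorial note: the exponential cost mentioned here is easy to put in numbers: a state vector of n qubits holds 2^n complex amplitudes, and a density matrix (2^n)^2 entries, so each extra qubit doubles the former and quadruples the latter. A back-of-the-envelope sketch, assuming complex doubles at 16 bytes each:]

```python
# Classical memory needed to store an n-qubit state exactly.
BYTES_PER_AMPLITUDE = 16  # one complex double (2 x 8 bytes)

def state_vector_bytes(n):
    # 2**n complex amplitudes for a pure state
    return (2 ** n) * BYTES_PER_AMPLITUDE

def density_matrix_bytes(n):
    # (2**n) x (2**n) complex entries for a density matrix
    return (2 ** n) ** 2 * BYTES_PER_AMPLITUDE

for n in (10, 30, 53):
    sv = state_vector_bytes(n)
    dm = density_matrix_bytes(n)
    print(f"{n:2d} qubits: state vector {sv / 2**30:15.3f} GiB, "
          f"density matrix {dm / 2**40:.3e} TiB")
```

Already at the 53 qubits of the Google chip, the bare state vector exceeds a hundred pebibytes, which is why the classical verification in the supremacy experiment was such a struggle.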
Herald: All right. Microphone eight,
please.
Mic 8: As a physicist who's now doing analog circuit design, I'm kind of wondering why all the presentations about quantum computers always use states zero and one and not multiple states. Is that a fundamental limitation, or just a simplification for the sake of the presentation?
Andreas: So you mean why we don't use higher qubit states, or...
Mic 8: Multi valued logic or even
continuous states?
Andreas: So in principle, the quantum bits that we're using are not really two-level systems. There is not only level zero and one, but also level two, three, and so on. You could use them, of course, but the computational power of the system is given by the number of states per system, say m, raised to the power of the number of qubits, so m to the power of n. So in that sense, if you add another state, you only gain a small factor compared to adding another qubit. So it's usually not very interesting to add more states; what you would do instead is just add more qubits to your system. As for continuous-variable quantum computation, I think there are some use cases where this might outperform digital quantum computers, especially if you can engineer your system to mimic the Hamiltonian of the system that you want to simulate. So I think in these cases it makes a lot of sense. For other cases, where you say, okay, you want to run a general quantum computation, then such a digital quantum computer is probably the best solution. And you could also run a continuous simulation of a quantum system on such a gate-based quantum system with the same order of complexity, I would say. Does that answer the question?
Mic 8: I think I deluded myself into having understood that the non-diagonal elements in the density matrix grow much faster than the number of states in the diagonal matrix elements.
Andreas: I guess you could say it like that. Yeah, I have to think about that.
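[Editorial note: the m-to-the-power-of-n argument can be put in numbers. Measured in "equivalent qubits", i.e. log2 of the state-space dimension, moving from two to three levels per system only multiplies the count by the constant log2(3) ≈ 1.58, while adding more systems grows it without bound. A small sketch, with a helper name that is ours:]

```python
import math

# A register of n systems with m levels each has m**n basis states,
# i.e. n * log2(m) "equivalent qubits" of computational size.
def equivalent_qubits(levels, systems):
    return systems * math.log2(levels)

print(equivalent_qubits(2, 53))   # 53 qubits -> 53.0
print(equivalent_qubits(3, 53))   # 53 qutrits -> ~84: a fixed ~1.58x gain
print(equivalent_qubits(2, 106))  # doubling the systems doubles it: 106.0
```

This is why architectures grow by adding qubits rather than by adding levels: extra levels buy only a constant factor, and controlling higher levels with good coherence is hard in practice.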
Herald: All right. Number three, please.
Mic 3: What do you have to say about the scepticism of people like Gil Kalai who claim that inherent noise will be a fundamental problem in scaling these quantum computers?
Andreas: I mean, it's a valid concern, I think. As of today, we haven't even shown error correction for a single qubit. There are some first experiments, for example by the Schoelkopf lab at Yale, that showed some of the elements of error correction for a single-qubit system, but we haven't even managed today to keep a single qubit alive indefinitely. So that's why I would say it's an open question; it's a valid criticism. I think the next five years will show whether we are actually able to correct these quantum errors, and whether our error models themselves are correct, because they only correct for certain errors, or whether there's anything else that keeps us from building a large-scale system. So I think it's a totally valid point.
Herald: Microphone five, please.
Mic 5: There has been a study on factorizing on adiabatic machines, which requires log-squared-N qubits while Shor requires log N. But as the adiabatic systems have much higher qubit numbers, they were able to factorize much larger numbers on these machines than on the normal devices. And that's something that never shows up in the discussion. Do you want to comment on that? Have you read the study? What do you think? Are adiabatic machines bogus, or is that a worthwhile result?
Andreas: Yeah, as I said, I'm not an expert on adiabatic quantum computing. I know that there were some studies or investigations of the D-Wave system; I haven't read this particular study about factorization. I think adiabatic quantum computing is a valid approach to quantum computing as well. I'm just not sure if currently the results are shown with the same amount of rigor, with proofs as rigorous as for the gate-based quantum computer. But yeah, I really would have to look at the study to see.
Herald: Can you maybe quickly say the authors, so it's on the record? Yeah, if your mic is still on, number five.
Mic 5: Sorry, I don't.
Herald: Okay, no problem. Thank you. All
right.
Andreas: But yeah, I think adiabatic quantum computing is a valid choice, a valid approach for doing quantum computation as well.
Mic 5: So I can give you that. I can
search for the authors later and give it
to you.
Andreas: Okay, that would be great. Thank you.
Herald: Thank you. Microphone four,
please.
Mic 4: What do you say about IBM's claim
that Google's supremacy claim is invalid
because the problem was not really hard?
Andreas: Yeah. So basically IBM, I think, said, okay, if you do some optimizations in the way you simulate the systems, then you can reduce the computation time from 10,000 years to maybe a few hours or so. It might be a valid claim. I don't know if it really invalidates the result, because, as I said, the computational power of classical systems will also increase in the coming years. Right now you could say that maybe, if we haven't achieved quantum supremacy with regard to, let's say, 2019 hardware, then maybe we should just look at 2015 hardware, and then we can say, okay, there we probably achieved it. In any case, what's most impressive about this result for me is not whether we are really in the supremacy regime or not; it's really the degree of controllability of the qubit system that this group has achieved. I think that's really the important point here, regardless of whether they actually achieved supremacy or not, because it shows that these kinds of systems seem to be a good architecture choice for building large-scale quantum processors. And this alone is very valuable, I think, as a guide for future research directions, regardless of whether they actually achieved this or not. But yeah, I can understand the criticism, of course.
Mic 4: OK. One thing. The article is called "Quantum Annealing for Prime Factorization", appeared in Nature in December 2018. Authors Jiang, K. A. Britt, A. J. McCaskey, T. S. Humble and Kais.
Andreas: Okay, great. I will have a look at that. Thanks.
Herald: All right. Microphone 6, do you
have a short question?
Mic 6: Yeah, hopefully. It is known that it is not very easy to understand how a large quantum superposition goes over into a macroscopic state, in the macroscopic physical description. So apparently there are a couple of things not understood. So is there anything you know about whether, when you go to two thousand, ten thousand, a million qubits, you could expect the quantum behavior to break down? Is there any fundamental argument that this will not happen, or is this not a problem considered currently?
Andreas: Huh, okay. I'm not sure if I fully understand the question. It's mostly about whether quantum mechanics has some scale dependence, so that if you go to a certain scale, at some point you get some irreversibility or something like that. Yeah. I mean, I think that large quantum systems occur naturally; a Bose-Einstein condensate, for example, has a lot of degrees of freedom that are not controlled, of course, but that are also quantum mechanical, and there it seems to work. So personally I would think that there is no such limit. But, I mean, who knows? That's why we do experimental physics, so we will see if we reach it. But from the theory of quantum mechanics right now, there is no indication that there should be such a limit, to my knowledge.
Herald: All right, so maybe we will see
you again in five years.
Andreas: Yeah.
Herald: So please thank Andreas once again. Thanks.
Applause
36c3 postroll music
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!