Frequency Matters Podcast: Quantum Computing with Dave Slack

Summary

Gary Lerude of Microwave Journal recently spoke with Dave Slack, Engineering Director at Times Microwave Systems®, about quantum computing. Watch the complete video or read the session notes below.

Session Notes

What is quantum computing?

Quantum computing has been in development for a couple of decades now, but it's really starting to heat up, for several reasons. I compare it to classical computing. With classical computing, which we're all familiar with, we use ones and zeros, binary data. Quantum computing works using the quantum properties of atomic and subatomic matter. It uses those very strange quantum properties, and it's pretty interesting technology.

I tend to look at new technologies like this in a historical context, because I find you can look at what has happened, play that forward and get a pretty good understanding of how things are going to play out in the upcoming years. If you look back at the development of classical computing: in 1951, the U.S. Census Bureau took delivery of a machine called UNIVAC. This UNIVAC was room sized, it was sixty thousand pounds, 5,000 vacuum tubes, and it could perform calculations at a massive 1,000 calculations per second. We'd call that a kilohertz. Today we have hundreds of megahertz of computing power in our pockets that we're just carrying around with us, so quite a gap has been covered in that period. From 1951 with the vacuum tubes, transistors were introduced in the late 50s; in the 60s and 70s those transistors were combined into integrated circuits. In the 1980s microprocessors came along, and computers got more and more powerful and smaller and smaller. In 1965, a gentleman named Gordon Moore put out Moore's law, that the transistor density of chips would double every year, and that held true from 1965 until about 2010.

In 2010, Moore's Law started to trail off, meaning that the technology behind classical computing is starting to reach the limits of physics and the limits of economics. Getting the computing power that many industries need is going to be increasingly difficult with classical computing. Classical computing, as we mentioned earlier, uses very discrete ones and zeros. No ambiguity. It's either/or, nothing in the middle. It's very predictable, it's very reproducible. Quantum computing uses the properties of quantum mechanics, which is the mechanics of matter at very, very microscopic scales.

The basic element for computation is called a quantum bit, or qubit. A qubit can be a one or a zero, but it actually shouldn't be thought of as a one or a zero, because it can be both at the same time, which is kind of weird when you think about it. Quantum mechanics also gives us the principle of entanglement, where one qubit can be coupled with another qubit and their states will mimic each other without any physical connection, which is super weird.

When I try to get my head around how a bit of data can be a one and a zero at the same time, that's really hard for me to grasp, so, being a microwave guy, I think of it in terms of noise. I think about a spectrum analyzer with no input: you set your filters up and you get a noise floor down at -130, -140 dBm, and you see the randomness, just random noise. You could park a marker at any frequency you want, and you're going to get a range of noise values; they're going to be all over the place. So you cannot say that if I put a marker at five gigahertz, I am going to measure power X. It's just not deterministic like you would expect in a classical computing world. Instead of a one or a zero, what you're going to see is a range of numbers, and if you capture those numbers over time you can describe them statistically: you will have X probability of measuring Y value every time you sample. That's about the best you can do. And that's the way I think of a quantum bit, or qubit, being a one and a zero. If you're controlling it and it should be in the one state, statistically it's going to tend to be in the one state, but it could be a zero or anywhere in between. It's only what it is when you measure it that matters. Quantum computing takes advantage of both of those properties, the superposition of the one and the zero and entanglement. Entanglement gives you true parallel processing. Whereas classical computing power increases linearly with the number of transistors and the number of bits, quantum computing power increases exponentially with the number of qubits, so these machines become massively more powerful every time you add another qubit. They have the ability to be super powerful with a fraction of the energy usage.
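As a minimal illustration of that statistical picture, here is a short Python sketch (the amplitudes and sample count are assumed for illustration, not from the interview): a qubit in superposition is described by two amplitudes, and each measurement yields a 0 or a 1 with the corresponding probabilities, much like sampling the noise floor. The same sketch shows the exponential scaling: describing n qubits takes 2^n amplitudes.

```python
# Illustrative sketch (assumed values): sampling a qubit in
# superposition. The state is described by two amplitudes; measuring
# it yields 0 or 1 with probabilities |a0|^2 and |a1|^2.
import random

a0, a1 = 0.6, 0.8          # example amplitudes, |a0|^2 + |a1|^2 = 1
p1 = abs(a1) ** 2          # probability of measuring a 1

samples = [1 if random.random() < p1 else 0 for _ in range(10_000)]
print(f"expected P(1) = {p1:.2f}, measured = {sum(samples)/len(samples):.2f}")

# The exponential scaling mentioned above: n qubits require 2**n
# amplitudes to describe, so each added qubit doubles the state space.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2**n} amplitudes")
```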

So I'm with you so far, that makes sense. I like your analogy of the noise floor on a spectrum analyzer to understand the probabilistic nature. Thinking in more practical terms, you talked about the power of quantum computing, but what would a quantum computer be able to do?

First of all, what they will not do is replace classical computing. We're always going to have that; it's always going to be the best tool for many, many calculations. Quantum computing uses different bits and different algorithms and works totally differently. What it will allow you to do is take problems that have a massive number of inputs, where those inputs are not always discretely defined but are more statistical, and process them, because that's the way the machine thinks and works.

The use cases that are seen for this are anything where a large number of complex variables are involved: cybersecurity, banking security, financial and economic modeling. If you try to get two economists to agree on what the economy is going to do, their models are limited in what they can do, so the quantum computer is really geared toward that kind of complex problem with thousands and thousands of inputs. Predicting the weather: you can never get two weathermen to agree on what the weather is going to do. Modeling climate change is super complicated. All these super complex models are where I think quantum computing has a fit.

I think artificial intelligence is just going to explode when it has access to this tool. The other thing is aerodynamic and thermodynamic modeling, especially with hypersonic weapons. The thermodynamics and aerodynamics at those speeds and velocities aren't well known, so today they're doing a lot of physical testing to understand them, and they run models that take weeks to complete. Having a quantum computer to run those models would mean way less physical testing, and being able to run more models much more quickly would be super valuable.

That makes sense, a lot of power involved there, particularly with very complex problems. Now, we hear that there's a connection between quantum computing and microwave engineering. How is microwave engineering involved in quantum computing?

That's the question for this audience, I'm sure. So how do we play in this? You can think of a qubit as a microwave resonator, like an LC tank, and you can drive it from the zero state to the one state, from one energy level to another, with a microwave signal. It's a resonant structure: you drive it with a signal at its resonant frequency and you change the energy level within that resonator. Under this driven condition, the probabilities of being a one or a zero vary sinusoidally with time, and they can be controlled much the way any other signal can be controlled. It's important to know that, like other signals, qubits have a magnitude and a phase relationship. They're complex signals, so these concepts are very familiar to the microwave community.
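A minimal sketch of that sinusoidal behavior, using the textbook on-resonance Rabi-oscillation formula with an assumed drive strength (illustrative only, not a specific device's numbers):

```python
# Illustrative textbook model (assumed drive strength): on-resonance
# Rabi oscillation. Under a resonant microwave drive, the probability
# of measuring the qubit in |1> varies sinusoidally with drive time:
#   P1(t) = sin^2(Omega * t / 2), where Omega is the Rabi frequency.
import math

OMEGA = 2 * math.pi * 5e6   # assumed 5 MHz Rabi frequency, in rad/s

for t_ns in range(0, 201, 25):            # drive time in nanoseconds
    t = t_ns * 1e-9
    p1 = math.sin(OMEGA * t / 2) ** 2
    print(f"t = {t_ns:3d} ns  P(|1>) = {p1:.3f}")
```

With these assumed values, a 100 ns drive flips the qubit fully from zero to one, and stopping the drive partway through leaves it in a superposition.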

One of the limiting factors in quantum computing is that this resonator, under this driven condition, is predictable and controlled only for a certain period of time, because with any resonator there are losses and disturbances that cause it to lose energy and stop resonating. That's the limiter here. That property is called the coherence of the qubit. When qubits decohere, they're no longer predictable, they're no longer controlled, and that's analogous to bit errors in data, so you'd have computational issues. The coherence and control of the qubits is one of the real driving issues behind the technology development. What it really boils down to is noise: thermal noise, magnetic noise, and mechanical noise, vibrations, things like that. Microwave hardware has to feed these resonators while minimizing those contaminations. In fact, one of the prime requirements is super low noise driving signals for the qubits, especially low phase noise, so a lot of work is going on in ultra low phase noise oscillators and things like that.
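One common simplified picture of that loss of coherence is an exponential decay envelope on the oscillation above; here is a sketch with an assumed coherence time T2 (an illustrative model, not a measured device):

```python
# Illustrative sketch (assumed exponential-decay model): a decohering
# driven qubit. The Rabi oscillation's contrast decays with a
# coherence time T2, leaving only a 50/50 statistical mixture:
#   P1(t) = 0.5 * (1 - exp(-t / T2) * cos(Omega * t))
import math

OMEGA = 2 * math.pi * 5e6   # assumed 5 MHz Rabi frequency, in rad/s
T2 = 100e-9                 # assumed 100 ns coherence time

for t_ns in (0, 50, 100, 200, 400, 800):
    t = t_ns * 1e-9
    p1 = 0.5 * (1 - math.exp(-t / T2) * math.cos(OMEGA * t))
    print(f"t = {t_ns:3d} ns  P(|1>) = {p1:.3f}")
```

Once the envelope decays, the qubit is a coin flip no matter how it is driven, which is why coherence time bounds how many operations a computation can perform.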

To have a computer, you need more than one bit, so you have two or multiple qubits. When you have two of these qubits, they can be coupled together and controlled by the driving signal, which is at a microwave frequency and can be amplitude and phase modulated to give it certain properties. Both qubits can be modulated separately, and then, using the wave properties of the two, you can get them to perform in certain ways, and the wave properties actually interfere. It's very analogous to the interferometry that antenna and radar people do.

I think of it again in terms of that classic high school demonstration with the two slits and the laser beam, where you get the interference pattern: areas where the two wave patterns interfere constructively and others where they interfere destructively. You get high-density probabilities and low-density probabilities, essentially the ones and zeros, and those can be controlled. All of the hardware that's used to control these qubits and this coupling will introduce those noises, the thermal, the magnetic, the vibrational noise. So aside from the low noise sources for these driving signals and precise modulation schemes, it's hardware that minimizes that contamination. This is where I think a lot of the microwave community can contribute.
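The two-slit picture is the same complex-amplitude math microwave engineers already use. A minimal sketch (illustrative values) that adds two unit phasors and sweeps the relative phase to show the constructive and destructive extremes:

```python
# Illustrative sketch: two-path interference with complex amplitudes,
# the same math as the two-slit pattern. The combined probability
# (power) is |A1 + A2|^2, which swings between constructive (4x a
# single path) and destructive (zero) as the relative phase varies.
import cmath
import math

A1 = 1.0                                    # reference path amplitude
for deg in (0, 45, 90, 135, 180):
    A2 = cmath.exp(1j * math.radians(deg))  # second path, phase-shifted
    power = abs(A1 + A2) ** 2
    print(f"relative phase {deg:3d} deg -> |A1 + A2|^2 = {power:.2f}")
```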

I think one of the requirements to maintain low noise is to have very low temperatures. Is that correct?

Absolutely. The lumped-element model of the resonator is an inductor in parallel with a capacitor. On paper, that's a perpetual motion machine: the collapsing magnetic flux charges the capacitor, which then discharges back into the inductor, and the cycle repeats. That would go on forever on paper, but in reality there are resistive losses, so you get a damped sine wave unless you feed back some of that energy with the driving signal. Being at cryogenic temperatures, below 4 millikelvin, which is really close to absolute zero, and which also astounds me that we can get things that cold, you minimize resistive losses. Resistance is virtually zero, so you don't have those losses. The noise floor is lower, so thermal noise is less. Quantum computing really wants to happen in a hard vacuum, at super cold temperatures, totally shielded from the Earth's magnetic field. It's a very pristine place where these computations want to take place.
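A small sketch of that lumped-element picture, modeling the loss as a series resistance in the tank (component values assumed for illustration): the LC product sets the resonant frequency, and as the resistance approaches zero the ringing persists and the Q soars.

```python
# Illustrative sketch (assumed component values): an LC tank with a
# small series loss resistance R. The resonant frequency is
#   f0 = 1 / (2*pi*sqrt(L*C))
# and the ring-down envelope decays as exp(-t / tau), tau = 2L / R,
# so as R -> 0 (superconducting, cryogenic) the oscillation persists.
import math

L = 1e-9      # assumed 1 nH inductance
C = 1e-12     # assumed 1 pF capacitance

f0 = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"resonant frequency f0 = {f0/1e9:.2f} GHz")

for R in (1.0, 0.1, 0.0001):       # ohms: lossy down to near-superconducting
    tau = 2 * L / R                # ring-down time constant of the envelope
    q = 2 * math.pi * f0 * L / R   # quality factor of the resonator
    print(f"R = {R:g} ohm  tau = {tau*1e9:.1f} ns  Q = {q:.0f}")
```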

You’ve convinced me that there’s a lot of potential here. What’s the actual state of the technology and capability today? I know there’s a lot of research going on, but where are we?

The way I see it, we are in the period before UNIVAC was delivered to its customer. We don't quite have that UNIVAC machine, that big room-sized monster. The first commercial quantum computers are going to be quite analogous to that, and I think over time they're going to do what classical computers have done: get smaller, faster and more accurate very quickly. But I think the development time frame is going to be a fraction of the decades that span between then and now. I think the next five years are going to be astounding. Next year is going to be a big year, and the next ten years are just going to be amazing in terms of development.

Google is spending billions of dollars on this, and they recently claimed quantum supremacy: there's a computation that Google's quantum computer solved in 200 seconds that they claim IBM's Summit machine would take 10,000 years to solve, which is an astounding claim. IBM, of course, disputes this. IBM says its Summit computer could solve this problem in two and a half days. But even by IBM's own numbers, a two and a half day solution can now be done in 200 seconds, which is about a thousand-to-one improvement. And this is on that lab-level computer. So if we can get a thousand-times improvement on our quantum UNIVAC in the next five or ten years, only the imagination is our limit. I think it's going to be a really interesting ride, and I'm really looking forward to seeing it unfold.
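That thousand-to-one figure is easy to sanity-check from the numbers quoted above:

```python
# Sanity check of the quoted figures: IBM's own 2.5-day classical
# estimate versus Google's 200-second quantum runtime.
classical_s = 2.5 * 24 * 3600   # 2.5 days in seconds
quantum_s = 200
print(f"speedup ~ {classical_s / quantum_s:.0f}x")   # ~1080, roughly 1000:1
```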

About the author

David Slack, Director of Engineering

David Slack is director of engineering at Times Microwave Systems. He has extensive experience in the development of high-performance coaxial cable interconnects and related technologies. He received a Bachelor of Science in electrical engineering from Fairfield University.