
Podcast with Bob Sorensen - Senior VP of Research at Hyperion Research

July 13, 2021

My guest today is Bob Sorensen, Senior VP of Research at Hyperion Research.

Bob and I talk about what performance improvements customers expect from quantum computers, the mistakes that quantum equipment vendors might be making, and much more.

Listen to additional episodes by selecting 'podcasts' on our Insights page

You can also subscribe using your favorite streaming service

The full transcript is below

Yuval Boger (CMO, Classiq): Hello, Bob, and thanks for joining me today.

Bob Sorensen (Senior VP of Research, Hyperion Research): Hi, Yuval, how are you doing? And thanks for inviting me to speak on your podcast today.

Yuval: So, who are you, and what do you do?

Bob: My name is Bob Sorensen. I'm the Chief Quantum Computing Analyst at a research company called Hyperion Research. And just to give you a little bit of background of the company (this isn't an advertisement as much as it is an explanation of our pedigree), Hyperion Research and the small number of folks who work there basically come out of the high-performance computing sector. Most of us have at least 30 years of experience in high-performance computing: the traditional Cray architectures, the high-performance systems, the things you see on the top 500 list of the most powerful HPCs in the world.

About five or six years ago, we started to get many customer questions. Now, remember, our customers are HPC vendors and HPC users, and they began to say, "We hear interesting things about quantum computing. What are the potential challenges? What are some of the things we need to be preparing for?" Those were the questions we started to get. So we started looking at quantum not for its own sake, but at how it's going to integrate itself into the overall advanced computing architecture, whether that's HPC on-prem, HPC in the cloud, or perhaps post-CMOS architectures such as neural computing or DNA storage.

So we looked at it as more of an aspect of advanced computing and tried to figure out exactly how it would fit into the overall advanced computing capabilities. Can users think about using quantum computing to address some of their interesting and challenging workloads that right now are on their classical systems? What do vendors need to do to prepare and to think about? Should they be offering products? Should they be partnering with quantum computing suppliers? How is the calculus of their particular product development stream going to change? And so, Hyperion Research has been involved in that journey, and we've spent a lot of time establishing some very good relationships with a lot of quantum computing suppliers and, happily, end-users around the world to try to add some sense of context here.

Quantum computing is not an island. It exists as an element of advanced computing, and the ability to talk to users and suppliers across the advanced computing ecosystem and say, "This is where quantum computing fits in," is, I think, a significant value-add at this time. I say that because there is so much hype, misinformation, and confusion about what quantum computing could bring to the table. In essence, I view one of my primary roles as being a voice of reason, sorting out some of the issues in what's going on out there. And that's why I'm here today, and that's really been my mission for the last five years.

Yuval: I'm curious what the answer is. How does quantum computing fit in the high-performance computing infrastructure?

Bob: Well, it's interesting, because if we could turn back the clock on some of the rhetoric, there are two points I'd really like to stress. The first one is, I wish we hadn't named it quantum computing. I wish we had called it quantum accelerators, because for the near term, and I think for perhaps the foreseeable future, what quantum computing really brings to the table in the overall advanced computing scheme is the ability to offer some very significant performance improvements on a very narrow range of applications.

As I say at the end of some of my talks, no one will ever say, "Let me go check my quantum computer to see if I have any new email." It's not a general-purpose replacement. I don't want to belabor the analogy too much, but I like to think of it as what happened when GPUs came along: very interesting use cases but not a general-purpose solution. They accelerate specific jobs: artificial intelligence, machine learning, and deep learning applications are great examples, but they're not the be-all and end-all that solves all use cases for all users. For specific applications, though, they offer significant performance gains, and that's really where I think quantum computing needs to be. So that's the first issue I like to talk about.

The second one is this dichotomy that I see in the sector right now between what the quantum computing suppliers are thinking and what the quantum users (or potential users, in many cases) are thinking. If you read the popular press, you'll see phrases like "quantum advantage" or "quantum supremacy." The quantum computing community, the research side specifically, looks for applications that demonstrate this supremacy.

The idea is that I have an application I can only do on a quantum computer, because if I run it on a classical system it may take 15 billion years, whereas I can do it on a quantum system in, say, under an hour. Well, that's the supremacy issue, and that's the one I think the sector is concentrating a bit too heavily on, mainly because it is a relatively far-off goal. Frankly, I think it's laudable, but I don't see it as the main point when I talk to users. Remember, those are the folks who are looking at this primarily as accelerators.

And when I ask users what performance gains they would want out of a quantum accelerator to justify putting one in the basement or logging onto a cloud system, we found that almost half said 50x. Give me a 50x performance gain. They're not interested in quantum supremacy. They're interested in a performance boost of 50x. A smaller number, about a quarter if I remember correctly, said, "If you give me 10x, a single order-of-magnitude performance increase, I'm going to be really happy about that."

And what that tells me is that users have much more modest performance expectations. They're not looking for new applications that heretofore were intractable on classical systems. What they're looking for is about a four-to-five-year leap forward, because if you look at 50x as a performance number, that's roughly how far the classical world at the high end moves in that time. If you want a 50x performance gain, you wait about four to five years. That's all they're looking for. And that tells me they're not looking for unheard-of levels of performance; they're looking for a competitive advantage against their non-quantum counterparts, and they want to turn that into some economic advantage. And so there's this dichotomy: the quantum computing suppliers, I think, have sometimes almost forgotten who they should be selling to. They're too keyed up with the internal competition. Who's got the most qubits? Who's got the biggest quantum volume? Who's approaching quantum supremacy? Whereas the users are just saying, "Give us more realistic performance enhancements, and we'll be very happy to write the checks for you guys."
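A rough back-of-the-envelope check of that figure, offered purely as an illustration: a 50x gain spread over four to five years implies a compound improvement of roughly 2.2x to 2.7x per year, as this short Python snippet works out.

# Illustrative arithmetic only: the implied annual growth factor if high-end
# classical performance improves 50x over 4-5 years.
for years in (4, 5):
    annual = 50 ** (1 / years)
    print(f"50x over {years} years -> about {annual:.1f}x per year")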

Yuval: But 10x or 50x performance could definitely be a game-changer. I remember speaking with Honda several years ago when they were doing car design using clay models, and it would take them six weeks to make a model and then get feedback from the executives. Suddenly, with 3D modeling and virtual reality, they could do this cycle in a day and could roll out new cars, or more innovative cars, to the market much faster. So wouldn't you say that 10x or 50x is a game changer even if that's all quantum delivers?

Bob: That's the thing. That's all the users are interested in. I think a 50x leap is significant; at least 50% of the end-users right now would love to have that. And there's a dichotomy with some of the suppliers I talk to. I talked to a company a couple of days ago, and I won't mention their name even though it was a press briefing, and their philosophy was: we're ten years away, because until we hit a million qubits, we don't see the financial return. So I said, "There's a risk in sitting on the sidelines for the next decade when there's so much interesting work to be done." But again, there's this perception that if you're not offering these unimaginable speed-up levels, it's not worth playing. And I think the users aren't expecting that.

I think the companies working out there today should establish two major things. The first one is a sense of progress. So if you look at roadmaps, and I would point to, say, IBM, or Honeywell, or Rigetti, some of the hardware vendors of note if you will, they have already published roadmaps that say, "We may not be there yet, but we are demonstrating progress on a regular cadence. We're doing something that says we will get there eventually." They're trying to build confidence in what the users are thinking, and to me, that confidence-building measure is critical.

The other thing those companies are doing is working more and more with end-users to develop applications. It's one thing to come up with a quantum algorithm, and it's another thing to take that algorithm and apply it to a use case in a specific vertical that matters. And that's really the next big step in quantum. I can develop an algorithm, I can develop a piece of software, I can sell a piece of hardware, but when I can sit down with an oil and gas company and say, "I have a really interesting way to optimize how you extract oil from one of your large field discoveries, an optimal extraction technique where the most oil comes out at the least possible cost," that's when end users sit up and take notice.

Moving from the abstract concept of algorithms to demonstrated, viable end-use cases is what some of the progressive companies in the quantum computing sector are doing today. They're not saying, "Look at our speed-up." They're not saying, "Look at how many qubits we have." They're saying, "We're exploring how we can save a company X amount of dollars because we can optimize their logistics process. We can optimize the flow of materials through a factory to push steel out in a more effective way." They're optimizing things like how to do risk assessments of portfolios in near real time.

So if I have a portfolio of 1,000 instruments, investments, and one or two things happen to go sideways at some point, how can I quickly decide how that affects my overall portfolio risk exposure? If I can do that effectively on a quantum system that beats a classical system, perhaps only by a factor of 10, well, if I have to wait one hour for that data versus 10 hours, that means a lot to me as an investor. That's a demonstrated use case that drives continued interest in the technology. Because right now, if you look at a lot of what's going on in quantum computing, the major funding still comes from venture capital organizations and from government programs. And those folks, certainly the venture capitalists and to a lesser extent the government programs, like demonstrated progress. They want results. They want a return on investment. Politicians can be just as fickle as VCs when it comes to money.
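To make that portfolio scenario concrete, here is a minimal, purely classical Python sketch of the kind of calculation being described: Monte Carlo re-estimation of a 1,000-instrument portfolio's value-at-risk after a couple of positions go sideways. The instrument count, return model, and risk metric are illustrative assumptions; the point is that this is the repeated-sampling workload a quantum accelerator would aim to speed up.

# Illustrative classical Monte Carlo risk re-estimation (assumed setup:
# 1,000 instruments, normally distributed daily returns, 95% value-at-risk).
import numpy as np

rng = np.random.default_rng(0)
n_instruments, n_scenarios = 1_000, 10_000
weights = rng.dirichlet(np.ones(n_instruments))   # portfolio weights summing to 1
vols = rng.uniform(0.01, 0.05, n_instruments)     # per-instrument daily volatility

def value_at_risk(volatilities):
    # Simulate daily portfolio returns and report the 95% VaR.
    returns = rng.normal(0.0, volatilities, size=(n_scenarios, n_instruments))
    portfolio = returns @ weights
    return -np.percentile(portfolio, 5)

baseline = value_at_risk(vols)
shocked = vols.copy()
shocked[:2] *= 3.0                                # two positions go sideways
print(f"VaR before shock: {baseline:.4f}  after shock: {value_at_risk(shocked):.4f}")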

And some of us are old enough to remember the AI winter in the late '80s, where we saw AI have huge promise and huge investment, then difficulty delivering results, and then it just went away. That was the AI winter: a freezing of all work. It took almost 20 years for that sector to revitalize itself, and it took new programming, new hardware paradigms, a bunch of things. There is a looming specter of a similar winter in quantum, though I don't want to say it's a high-probability event. Too much investment and too much promise without near-term delivery could spell funding concerns.

I've talked to several companies, and I asked them about the specter of quantum winter coming, and the best answer I got was from one CEO who basically said, "I'm not predicting if quantum winter will happen or not, just be assured that I'm preparing for one if it does happen." So there is this need to demonstrate that this is a vibrant and growing sector with promise in order to keep investment and R&D funds available until it reaches a point where it's self-sustaining. Revenue drives research which drives revenue, and so on. And that's what the sector is trying to push itself towards.

Yuval: Let me take the other side of the evolution-versus-revolution argument. New technologies come in, and sometimes you don't know what they're going to be used for. I mean, think about what the Internet was invented for versus what it is being used for today, or what we thought cell phones were going to do versus what we do with them today. If you think about a quantum computer with even 500 qubits, you cannot simulate 500 qubits today. And because you cannot simulate them, it's really difficult to build algorithms that take advantage of them. But if in a couple of years you have 500 or 1,000 or more qubits, then who knows what revolutionary algorithms will be available for those. And that is, if you believe the vendors, two years away.

Bob: The beauty of the sector right now is that there's so much work going on at every level of the stack. I see hardware innovation in what we call modalities, the different ways to achieve the quantum phenomena. You have superconducting qubits, you have trapped ions, you have photonics, you also have neutral atoms. You have all sorts of interesting approaches. They each have their own strengths and weaknesses, but the point is that it's a vital sector. Anybody who says they know exactly which is the winning modality right now, I view that with a certain amount of suspicion. We really don't know who's going to win yet, but there's vitality there. There's lots of enthusiasm, lots of research, and we see that across the stack. We see it in applications, and algorithms, and middleware, in the people out there looking at quantum compilers, and in the fact that you can write a quantum program that sits high enough in abstraction that you can be a Python programmer, make some quantum computing calls, and not really care about what happens at the quantum level.

It's like the fact that most Python programmers couldn't really explain to you how a CMOS gate built from transistors works inside an Intel processor. That's where we're moving. And that, I think, is the interesting part of this: the sector can move in parallel. So we can have application development and new algorithms and, in some sense, wait for the hardware to catch up with some of those ideas.
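As an illustration of that level of abstraction, here is a minimal sketch of a quantum program written entirely at the Python level, assuming the Qiskit and qiskit-aer packages are installed (an illustrative choice of toolkit; comparable SDKs make the same point). The programmer composes gates and reads out counts without ever touching pulse-level or cryogenic detail.

# Minimal high-level quantum program (illustrative; assumes qiskit and qiskit-aer).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)                # roughly half '00' and half '11'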

But the interesting point here is that in the meantime, some valuable work can be done. People are looking at noise-tolerant applications. Some of the frailties of quantum computing can be mitigated by picking applications that say, "I don't need a perfect answer; I just need a better answer." That's why optimization is such a buzzword in the quantum computing space right now: I don't need the optimal traveling salesman solution; I just need to produce a solution that's perhaps marginally better than what a classical counterpart could do at any given point in time. So again, it's the idea of demonstrating progress as opposed to reaching a holy grail where we can all sit back because quantum is going to do everything we ever hoped it would. It's going to be a journey, and it's going to take some time for all that to unfold, but the applications, the algorithms, the hardware, the software, it's all evolving almost in parallel. And to me, that is a very promising aspect of all this.
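As a toy, entirely classical illustration of the "better answer, not the optimal answer" framing, the Python sketch below compares a random traveling-salesman tour with a simple nearest-neighbour heuristic on randomly placed cities (the city count is an arbitrary assumption). Neither tour is provably optimal; the heuristic only has to beat the incumbent answer, which mirrors the point being made here.

# Toy "better, not optimal" comparison: random tour vs. nearest-neighbour tour.
import numpy as np

rng = np.random.default_rng(1)
cities = rng.random((50, 2))                        # 50 random points in the unit square
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)

def tour_length(order):
    return sum(dist[order[i], order[(i + 1) % len(order)]] for i in range(len(order)))

def nearest_neighbour(start=0):
    unvisited, order = set(range(1, len(cities))), [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist[order[-1], j])
        order.append(nxt)
        unvisited.remove(nxt)
    return order

random_tour = list(rng.permutation(len(cities)))
print(f"random tour length:            {tour_length(random_tour):.2f}")
print(f"nearest-neighbour tour length: {tour_length(nearest_neighbour()):.2f}")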

Yuval: When you advise companies about quantum computing, do you recommend that they start from the cloud or on-prem? And is the answer different for a commercial organization versus a Federal one?

Bob: That's a great question, and it's really difficult right now, because I'm a fan of letting 1,000 flowers bloom. Take the idea of throwing your software onto a cloud, say AWS, or Google, or Microsoft. If you've got a very interesting but somewhat complicated algorithm, I have to ask: how can you demonstrate and differentiate your product if you are one of 450 different quantum options on, say, a cloud service provider? It's hard to differentiate; it's hard to stand out. So if I'm talking to a very small company, perhaps a handful of programmers, with an interesting idea, the first thing I say is: don't just throw it onto the cloud because it has a low barrier to entry. You can put it up there, but if it's an interesting algorithm, or one that requires a certain amount of education on the customer side, how do you differentiate?

So doing it that way can be a problem. The issue of the sector consolidating right now is an interesting one. We're seeing more and more partnerships being formed. So, you look at software companies tying up with hardware companies as a way to offer up a single unified solution that still doesn't require one single entity to be full-stack. It's difficult to be full-stack. You need a lot of resources. IBM is probably the most successful full-stack company, but if you look at their R&D budget and their financial and technical wherewithal, they can support that effort. Not a lot of companies have the breadth and depth of resources to do that. So in some sense, it's really more about making sure that you understand where you fit into the sector and examine all the options. And don't just pick one.

I love the fact that, as I said, because the stack is somewhat mature right now, if I'm a software company I don't have to say, "Do I want to be a Rigetti software company? Do I want to be a D-Wave company? Do I want to be a Honeywell company?" I can write code that runs on all that hardware. So don't limit yourself to a single thread just yet, because the jury is still out. It's still out on modalities, it's still out on hardware, it's still out on architectures, and it's still out on corporate winners and losers. So be flexible, be broad-based, and look for opportunities where they arise.

Yuval: And as we get closer to the end of our conversation today, what do you think is missing other than bigger hardware as you go from the 20, 30, 40 qubits of today to 1,000 qubits tomorrow? Where do you think the biggest improvement needs to happen?

Bob: To me, there are two big ones, and I'll go a little nerdy here; I'm an electrical engineer by training. The first is scalability. I like that people can make qubits. But if you think about how a traditional system is built today, you don't have one big honking microprocessor; you have 100,000 of them, and you hook them together. You scale up to get the capability. Well, the issue in the classical world is that if you've got a million cores spread across 100,000 processors, they have to communicate with each other, and the network is what limits performance nowadays. It's not the processor itself; it's the ability of the processors to talk to each other. I'm not clear yet which particular modalities and which architectures scale better. But to me, we're not going to reach a single chip that has a million qubits in it. We're probably going to have 1,000-qubit chips, with 1,000 of them communicating with each other. That's going to be the architecture of choice in my mind.

So you have to look not at which modality offers the best single qubit, but at which modality allows you to scale in an interesting way. Because if you've got 1,000 chips and they're all quantum, but you can't communicate effectively in the quantum realm and have to drop down to the classical world to run communications, you're dead in the water. That's one particular problem. The other one I think is important is I/O. It's difficult to communicate with a quantum computer, so there are all sorts of instrumentation and control processes that are all classical IT, and the architecture needs to support that.

How can I talk to that quantum processor in a timely way and give it lots of instructions? Because the wonderful dichotomy of quantum computing is that most of the time you don't want to touch it. You want to maintain that quantum goodness, so you don't want the outside environment impinging on it at all, except when you do. When you want to get in, you want to shove data down there, you want it to do its quantum goodness, and then you want to get the data out. So some percentage of the time you want to leave it alone, and the rest of the time you want to be all over it so you can communicate with it.

I/O, I think, is another vexing problem, because right now it takes an awful lot of control lines to control every qubit. Now, a typical microprocessor, an advanced one, has billions of transistors. Billions of transistors, but not billions of I/O lines; it'd be strange to have a laptop like that. The quantum world has to figure out how to get complex signals into a qubit or a quantum processor without too onerous an overhead in the I/O capabilities. So scalability and I/O, I think, are the next big technical challenges from a hardware perspective.

From a software perspective: demonstrated use cases. I'm waiting for people to come along and say, "We just saved X billion dollars this year because we figured out an optimization routine for loading cargo on an aircraft, and now we can do it more effectively in real time, and this just saved us X amount of dollars on flight delays and fuel costs and such." More demonstrated use cases that someone, a budget director, can bring to the C-suite and say, "You don't want to hear about quantum. You want to hear about the fact that I just saved you $50 million this year." That, from the corporate side, is the next step, the next trend that I'd like to see happen more often.

Yuval: Excellent. So Bob, how can people get in touch with you to learn more about your work?

Bob: Well, Hyperion Research does have a website, and we're pretty visible out there. My name is Bob Sorensen, so if you just type in bsorensen@hyperionres.com, I'm there. And the interesting thing is, because this is such a new subject, I have this wonderful responsibility of not only doing things that keep the lights on in terms of paid work, but also consulting, talking to people, and just gathering information. So in many cases, all I like to do is chat with folks to hear their hopes, and their dreams, and their fears, so I can present a more accurate portrayal of how the sector's unfolding.

So it's really more about having conversations at this point than about me selling a glossy, brochure-driven document set. So reach out, make contact. I have a number of decks, and I'm more than happy to answer questions and give presentations down the road. But with that said, something like this is just fascinating to me in terms of getting the word out, and I appreciate the invitation today. And the fact that you actually let me talk, I think, for over 20 minutes shows how absolutely patient you can be, Yuval.

Yuval: Well, it was my pleasure having you today. Thanks very much for joining.

Bob: Great. All right. Thank you.


About "The Qubit Guy's Podcast"

Hosted by The Qubit Guy (Yuval Boger, our Chief Marketing Officer), the podcast hosts thought leaders in quantum computing to discuss business and technical questions that impact the quantum computing ecosystem. Our guests provide interesting insights about quantum computer software and algorithms, quantum computer hardware, key applications for quantum computing, market studies of the quantum industry, and more.

If you would like to suggest a guest for the podcast, please contact us.

