Transcript | Monte Carlo Simulations and Other Quantum Use Cases

Monte Carlo simulations have been around for quite a while in classical computing, and help companies model "what if" scenarios, such as how pricing might change over time. Since 2015, the quantum computing industry has believed Monte Carlo simulations would be one of the best use cases to show quantum advantage. QC Ware was able to prove the feasibility of this approach recently, working with Goldman Sachs. And now they want to share that field-tested approach, along with other use cases, in a powerful tool.

Guest: Yianni Gamvros, Head of Business Development at QC Ware

Konstantinos

Monte Carlo simulations have been around for quite a while in classical computing. Since 2015, the quantum computing industry has believed they would be one of the best use cases to show quantum advantage. QC Ware was able to prove the feasibility of this approach recently, working with Goldman Sachs. Now, they want to share that field-tested approach, along with other use cases, in a powerful tool.

Find out how to get access to proven reusable code in this episode of The Post-Quantum World. I’m your host, Konstantinos Karagiannis. I lead Quantum Computing Services at Protiviti, where we’re helping companies prepare for the benefits and threats of this exploding field. I hope you’ll join each episode as we explore the technology and business impacts of this post-quantum era.

Our guest today is the head of business development for QC Ware, Yianni Gamvros. Thanks for joining me.

 

Yianni

Thanks for having me. I’m glad to be here.

 

Konstantinos

Do you want to start by telling my listeners a bit about QC Ware?

 

Yianni

QC Ware is a quantum computing software startup. We build software that runs on quantum computers. The whole value proposition for the company is that we will help companies get ready for the potential upcoming quantum computing disruption. Quantum computing has the potential to disrupt everything we do today in data science and everything we do today in AI and ML. There is already a lot of excitement starting to build around this topic: around which use cases are going to be affected, and how exactly different companies need to be prepared. QC Ware's mission is to help these companies get ready for that upcoming disruption, to figure out which parts of their businesses are going to be affected and how to best make use of these quantum computers.

 

Konstantinos

I was looking into you guys, and I noticed that in addition to the work you do, which I’m going to get to in a little bit, you provide a way for people to do some of this themselves. Do you want to tell me a little bit about Forge?

 

Yianni

Forge is one of the two big ways we go to market: building this software product called QC Ware Forge. The other is our consulting services, which maybe we can get to later. The easiest way to describe Forge itself is that it's like a data science platform that runs on a quantum computer. Anything you would do today on a data science platform — let's say running a classification algorithm, a clustering algorithm or a neural network — you can do on Forge today, but the algorithm behind the scenes that runs on the platform is a quantum computing algorithm. This is an algorithm that can run on a quantum computer, and on this platform you have the option to either simulate on a classical computer what a potential quantum computer would be doing, or run on real quantum hardware.

You can do either. You provide the same inputs you would provide to a typical data science platform. A data scientist would come up with a data set — a table — and let's say what you're trying to do is a classification or a clustering of your data points. You provide this to Forge, then select one of the algorithms — one of the APIs available on Forge — and then you also select an appropriate back end and say, "I want this simulated" or "I want this to run on real quantum hardware." Once you specify all that, Forge takes it from you, does the simulation or runs on the quantum hardware, and returns the result.
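To make that workflow concrete, here is a minimal sketch of what a Forge-style call could look like from the data scientist's side. The function name, arguments and back-end strings are hypothetical stand-ins, not the actual Forge API, and the quantum algorithm is replaced by a plain k-means so the sketch runs on any machine.

import numpy as np

def fit_and_predict(data, model="q-means", backend="classical_simulator", n_clusters=2):
    """Stand-in for a Forge-style call. The model and backend arguments exist only to
    mirror the call shape; a real quantum back end would build and run circuits, while
    this stub falls back to a plain k-means so the example executes anywhere."""
    rng = np.random.default_rng(0)
    centers = data[rng.choice(len(data), size=n_clusters, replace=False)]
    for _ in range(10):
        # Assign each point to its nearest center, then recompute the centers.
        labels = np.argmin(((data[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([data[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                            for k in range(n_clusters)])
    return labels

# The data scientist's view: a table of points, an algorithm name, a back end.
rng = np.random.default_rng(1)
points = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
labels = fit_and_predict(points, model="q-means", backend="classical_simulator")
print(labels)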

 

Konstantinos

One thing that immediately leaped out at me was this idea of no prior experience necessary for quantum computing. Are you then providing abstracted building blocks to create a circuit?

 

Yianni

The way I explain it is that we have two main types of users. The first is the classical data science user, for whom the workflow is the one I just described — someone who doesn't really know, or doesn't care so much, about what happens under the hood.

In the typical workflow, the typical data science user doesn’t necessarily know how, let’s say, a clustering algorithm works, doesn’t necessarily know how a classification algorithm works. They just want to have a good implementation of one or more of these classification algorithms or these clustering algorithms. They provide their data set, they get back a result, and maybe they want to tune the algorithm with some parameterization. Maybe they want to tune their data set and try to make the algorithm work better or come back with better insights, but they don’t spend any time going out and building a new classification algorithm or a new clustering algorithm or something like that. 

That’s the one main type of user experience, and the main type of user, that we accommodate on Forge. However, the other type of user is a little bit more sophisticated, and peels back one layer of the onion. 

For that user, we say, "Well, let's just assume you have some quantum computing experience. You're a quantum engineer or quantum developer, and you want to build something that's highly customizable or highly bespoke for your specific use case, but you don't want to build everything from scratch. You don't want to essentially reinvent the wheel. You want to get some of the existing features we've already developed — existing quantum circuits we've already developed." We make those available to you as building blocks: pieces of quantum code, or quantum circuits. We give you these quantum circuits, and then you, as a quantum developer, can use one block, build a little bit on your own, then use another block, and keep going like that, interchanging those pieces until you end up with something that's unique to your problem.
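As a rough illustration of that composition style, here is a small sketch in which a developer stitches prebuilt sub-circuits together with their own gates. The block names and the plain gate-list representation are hypothetical, not QC Ware's internals.

from math import pi

def ghz_block(qubits):
    """Prebuilt block: entangle a register into a GHZ state."""
    first, *rest = qubits
    return [("h", first)] + [("cnot", first, q) for q in rest]

def qft_block(qubits):
    """Prebuilt block: textbook quantum Fourier transform (without the final swaps)."""
    ops = []
    for i, q in enumerate(qubits):
        ops.append(("h", q))
        for j, ctrl in enumerate(qubits[i + 1:], start=1):
            ops.append(("cphase", pi / 2 ** j, ctrl, q))
    return ops

# The developer's own circuit: block, bespoke gate, block.
circuit = []
circuit += ghz_block([0, 1, 2])
circuit += [("rz", 0.3, 1)]      # custom piece specific to this use case
circuit += qft_block([0, 1, 2])
for op in circuit:
    print(op)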

 

Konstantinos

Do you have to select the back-end target before you start doing this drilling down under the hood? Or is that selected later? Does it change the experience when you code it?

 

Yianni

There's no real difference in how you code it. That comes from the fact that the value we provide as QC Ware, and the value QC Ware Forge provides, is in these algorithmic building blocks, which sit very close to the end user, very high in the stack. Consider the entire stack, with the end user sitting at the very top. Then comes this layer of applications and algorithms from QC Ware Forge; then come other middleware layers, which typically take an algorithm or an initial circuit and translate it across different potential back ends; and then come the back ends themselves: the simulation back end and the different hardware vendors and hardware architectures.

We're sitting at the very top of this stack, very close to the end user. As a result, we can create an algorithmic building block or circuit building block, and the rest of the stack we're using, which comes from our partners, can disseminate it to the different back ends. Initially, when the user starts coding, they can say, "I'm going to use this building block, or I'm going to use this algorithm," and then later on they can say, "Now I'm going to define which back end I'm going to use, and get the results."

 

Konstantinos

So, you then handle it all the way? It goes from Forge directly to that back-end target? It’s not like you have to take the code and paste it into some other environment or anything like that?

 

Yianni

All of it happens in Forge, but again, I want to be clear that our value proposition is providing you with these end-to-end algorithmic building blocks. Taking it all the way to the hardware comes from us having integrated Forge with some of our partner solutions. Specifically, we've partnered with Amazon Braket, so we send the circuit we want to execute to Amazon Braket and tell them, "We want to execute this on a certain hardware vendor," and they complete the rest. That part comes from Amazon Braket, our vendor, but we are integrated with them, so the user doesn't have to copy-paste anything. They can just stay in the same environment and the same workflow.
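For readers curious about what that hand-off looks like, here is a minimal sketch using the Amazon Braket Python SDK directly; it illustrates the kind of circuit submission being described, not QC Ware's integration code. It runs a small circuit on the free local simulator, and the commented-out lines show how the same circuit would go to an IonQ device on Braket (the device ARN shown is an assumption and should be checked against the Braket console; real hardware also needs AWS credentials and incurs cost).

from braket.circuits import Circuit
from braket.devices import LocalSimulator
# from braket.aws import AwsDevice   # needed only for real hardware

# A small Bell-pair circuit, written once, independent of any back end.
circuit = Circuit().h(0).cnot(0, 1)

# Back end 1: local simulator.
device = LocalSimulator()
# Back end 2: an IonQ QPU on Braket (assumed ARN, for illustration only).
# device = AwsDevice("arn:aws:braket:::device/qpu/ionq/ionQdevice")

task = device.run(circuit, shots=1000)
print(task.result().measurement_counts)   # e.g. Counter({'00': 507, '11': 493})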

 

Konstantinos

So, you create an account, and it ties in with API keys or whatever to what you’re already paying for, and then they can just go with it. 

Is there any kind of way to control that for multiple users — to have controls on how much money they spend or anything like that?

 

Yianni

We started building Forge two and a half, three years ago, and that was one of the big requirements we were faced with right away, because all the customers using quantum computers right now are essentially R&D departments at large corporations. Those R&D departments work with many different business units and many different researchers, sometimes even external to the corporation — academics or other groups they might be working with.

So, there was already, from the start, a very strong need for this to be offered as an enterprise solution, where you have administrators controlling who gets invited into the account. Many different people need to get invited, so there needs to be user management and user control on the administrator side, along with limits on how many resources each group or each individual user can spend. The administrator for a given enterprise account can already set those constraints on the different types of users and different user groups.

 

Konstantinos

Excellent. I was looking at what’s there now. There’s optimization, machine learning, linear algebra, but you have Monte Carlo simulation coming, and chemistry. I consider simulation to be one of those big three: You’ve got your optimizations, your machine learning and your simulation. Are you then taking some of what you’re learning on the consulting side, from what you’ve done with Goldman, which we’ll talk about in a moment? Are you taking some of that and then adapting it to the Forge tool, bringing it to the masses?

 

Yianni

That's exactly the whole business model: execute several of these consulting engagements with large strategic customers, take the lessons learned and distill them into the product, then take the product to a bigger, wider market and have that wider market benefit from what we found in those strategic lighthouse engagements.

 

Konstantinos

I feel like I end up asking this often, but now that what you're feeding into it is code people can run, are any of your users also sharing? Are they also providing solutions to problems — especially the quantum engineers who really drill down? Is any of that then shared with future users who come to the tool with less experience?

 

Yianni

That's a good point, but no, we're not set up for that kind of community building. We're targeting this for enterprise use, performance use and competitive use. I feel that most of our customers wouldn't be interested in that, and I don't see Forge as the right environment to do it. I can definitely see many other environments where there's a need to offer code for reuse, but Forge isn't really built for something like that.

 

Konstantinos

If you're a user, then, you're getting the benefit of the code your own team created?

 

Yianni

Exactly.

 

Konstantinos

I guess that’s definitely a strong value proposition you wouldn’t put up there if you didn’t believe in it.

 

Konstantinos

About that Monte Carlo simulation topic, I think we could shift gears a little to it. Do you want to talk about what it was like setting that up for Goldman Sachs? First, maybe talk about what a Monte Carlo simulation is, if you want to maybe give a little high-level for listeners who might not know.

 

Yianni

Monte Carlo is one of those tools that big investment banks and insurance companies use quite a bit to figure out the likely price, at some future point, of a risky asset or financial instrument whose price fluctuates with the market — its future price is really unknown. They run a lot of different simulations under a lot of different scenarios, and then they take an average and a confidence interval around that average. That becomes their informed guess as to what that asset's price is going to be in the future.

For simple assets, these simulations are easy to do, but you need to run a lot of them to get interesting averages and interesting confidence intervals. For more complex assets, it becomes harder and harder, and you need to run more and more scenarios to get good accuracy on your average and a tight confidence interval around it.

What typically happens in these banks — and, again, this applies to investment banks and potentially retail banks as well, maybe to a lesser extent, and all kinds of insurance companies — they typically run all these simulations on GPU platforms overnight. These are calculations that typically take hours and hours on end to complete, and then, once the traders come in in the morning to start trading, they have the results from last night’s run. 

The value proposition for quantum computing to come in and address this area is to execute this calculation faster than the current classical infrastructure can, so that it can be done throughout the day rather than once overnight. With a quantum computer, potentially, at some point in the future, you might be able to run it many times throughout the day, so that the traders have the latest information in front of them at any given point — especially on a volatile day, when you might have different things happening in the market.
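As a concrete, purely classical illustration of the kind of calculation being described (a minimal sketch with made-up parameters, not Goldman's model): price a European call option by simulating many geometric-Brownian-motion scenarios, then report the average discounted payoff and a confidence interval around that average.

import numpy as np

rng = np.random.default_rng(42)

s0, strike = 100.0, 105.0       # spot price and strike (illustrative values)
rate, vol, t = 0.01, 0.20, 1.0  # risk-free rate, volatility, maturity in years
n_paths = 100_000               # more scenarios -> tighter confidence interval

# Simulate terminal prices under geometric Brownian motion.
z = rng.standard_normal(n_paths)
s_t = s0 * np.exp((rate - 0.5 * vol**2) * t + vol * np.sqrt(t) * z)

# Discounted payoff of the call option in each scenario.
payoffs = np.exp(-rate * t) * np.maximum(s_t - strike, 0.0)

price = payoffs.mean()
stderr = payoffs.std(ddof=1) / np.sqrt(n_paths)   # error shrinks like 1/sqrt(N)
print(f"estimated price: {price:.3f} +/- {1.96 * stderr:.3f} (95% CI)")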

 

Konstantinos

What kind of hardware did you use as the back end for this?

 

Yianni

We've had this collaboration with Goldman, which is public now. We've done multiple press releases — I think our first was back in December of 2019, and the latest one was a few months ago. When we started working with Goldman on this, it was to prove out that we can work on an algorithm that improves the speed of doing these simulations. Not just that we get faster results — actually, that was a result that was already known for some time. Researchers had already proven that quantum computers will be able to execute quantum Monte Carlo simulations faster than classical computers.

 

Konstantinos

Yes, like, around 2015, they proved that, yes.

 

Yianni

Yes, but the problem with that result was that the computational resources needed to run the algorithm were so high — the quantum computer needed to run it was so large, with such high demands on fidelity and qubit count and everything else — that analysts expected the quantum computer able to handle it would only be available 20 or more years into the future. You needed a lot of qubits, and a lot of high-quality qubits, to run that initial algorithm. Our initial collaboration with Goldman concentrated on, how do we get the benefit, how do we get the result we need, while at the same time minimizing the resources needed to actually run this algorithm?

As a result, if you minimize the resources needed to run the algorithm, you can essentially bring the time frame closer. You can bring the time frame for this algorithm to have an impact in the market from 20 years maybe to 10 years, maybe to 5, maybe sooner. 

We started working with Goldman, and for the first two years, we worked on the theory of how to get the same results with fewer resources. There was also a trade-off there: To be completely transparent and highlight this for your listeners, we had to trade away some of the benefit to significantly reduce the resources. Instead of getting the full benefit of the quantum algorithm as it's described in that initial paper, we can get maybe 80% of the benefit by using a computer only one-tenth the size of what was previously required. These numbers are rough — don't quote me on this. I'm just giving a conceptual example here.

Once we did that — once we proved the theory — then last year, we also went to one of the hardware vendors. We had a three-way partnership between Goldman, us and IonQ, which was the hardware vendor. We also executed on the hardware to make sure the theory checks out — the machines are now getting to the stage where they can run some of these calculations — and that we get the performance the theory says we're going to get. The latest press release we did with Goldman Sachs and IonQ highlights the fact that we completed that. We also published the results in a paper that's on the arXiv now, so we have confirmation that we're on the right track.

 

Konstantinos

Yes, and I'll link all that in the show notes. What's terrific there is that when that paper was written, the IonQ machine didn't exist — algorithmic qubits didn't exist. What they consider good quality keeps changing, and it's all about squeezing that extra little bit, as you know, from the stack: some from the hardware, some from the software, some from the approach. It impressed me, the way you were able to turn that on its head and get something usable in the short term.

We’ve seen that before with other things — like with portfolio optimization. There’s always some kind of trade-off to show some quantum edge right now. I hesitate to use the word advantage, because I don’t think we’re quite there yet.

 

Yianni

Yes, it’s definitely not an advantage.

 

Konstantinos

Yes, not an advantage.

 

Yianni

We have to be very clear. I forget exactly the derivative they priced, but it’s something very simple. It’s something that a classical computer can basically do in a fraction of a second. So, you can’t even compare. You can’t even benchmark right now. These things will become relevant once we have a lot bigger quantum computers and they can perform computations that the current classical computers literally take hours to do. That’s when the benefit will become apparent, but right now, we’re definitely not at that stage yet.

 

Konstantinos

That’s, of course, the whole concept of quadratic speed-up. From everything you did, are you feeling that we’re on track for quadratic speed-up, that it’s charting?

 

Yianni

All the results we've been getting over the last few years are very encouraging, both on the software side and the hardware side. We have these big milestones being achieved by the hardware guys, and we have these results on the software side that keep reducing the resource requirements needed to get to practical results and, eventually, to advantage.

The biggest thing for me right now is to see whether the hardware guys can stay on the road maps they have published. Now we have published road maps from IBM and Google and IonQ. It's important to see how close we can stay to those published road maps. If there's a little bit of deviation, that's fine. But if, two years from now, we're still talking about this being three to five years out, that's not going to be good.

 

Konstantinos

Three to five is that magic number I’ve been hearing since the beginning. It’s a good catch-all. 

You partnered with Goldman there on this, and I’m starting to think of them as trying to be second place to JPMC when it comes to quantum innovation. JPMC, it’s no secret, I think — they’re the leader in this space.

 

Yianni

Right.

 

Konstantinos

What was it like having a customer that's that aware? Did you have this enormous team to work with there that was really plugged in?

 

Yianni

It was definitely easy, on the one hand, because you didn't have to go in with the value proposition. You didn't have to answer "Why quantum now, even though it's far away?" All those questions were answered. They've already made the investment. They're growing their team. It was also interesting for our research team, our own team, that they could just go in there and have to-the-point conversations. They didn't have to start from scratch and educate them. We could go in and start discussing the details of how to proceed, how to push the boundary. Both on the business side and on the technical side, it was a pleasure because of the high level of the people on the other side, for sure.

 

Konstantinos

Excellent. Soon, anyone will be able to play with this exact setup — well, almost. Probably not the exact setup, but something similar, inside of Forge. Is there a project that's not quite public yet — and you obviously don't have to reveal anything — that's going to be the basis for the chemistry simulation, some real-world cutting of teeth with a partner to back it up?

 

Yianni

You don't have to read too much between the lines. It's almost all out there if you just go to our website and look at the research we've been publishing with other groups. You'll notice that within the last few months, we've published papers with the pharmaceutical company Roche on biomedical imaging, which is basically image classification — that's more on the QML side. We've also published a paper with Covestro, the German materials company, where we do chemistry simulation to discover properties of molecules for materials. And we've published some work with a team from Boehringer Ingelheim, the other pharma company — again on molecular simulation, this time having more to do with drug discovery, or the binding affinity of drugs. These are all papers that are already out there — I'm not revealing anything new.

 

Konstantinos

When I ask a question, sometimes I know the answer, but yes.

 

Yianni

Yes, exactly. If you look at all that, yes, these are the algorithms that are at the stage where we've worked with these large customers, we've discovered interesting results, and now we've published the papers. These will make it into the product in due course. I don't want to put a time frame on that, but that's definitely the model: Take these results and distill them into a usable form for everyone.

 

Konstantinos

That’s such a terrific approach. Anything you log in to and try, it’s been done. It’s great. It’s not theoretical.

 

Yianni

Exactly, yes. If you go to the Forge website, you can see, under pretty much every feature, which paper that feature is based on, so you can read it — there's a link to the documentation, obviously, but there's also a link to the paper that describes the advantage of using that feature. So we have, for example, a feature for doing linear algebra on a quantum computer — calculating distances and calculating matrix multiplications. On the website, we say, "This is the feature. This is the paper that explains why this feature is going to be faster on quantum computers than on classical computers." Everything we do comes from fundamental research we've done either on our own or with these large corporate customers.
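As a small aside on what "calculating distances" means in that linear-algebra feature, here is the classical calculation such a quantum routine is meant to speed up — a minimal sketch, not Forge's implementation. It computes the squared distance from a data point to each centroid in terms of inner products, which is roughly the quantity quantum distance-estimation subroutines are designed to estimate.

import numpy as np

x = np.array([1.0, 2.0, 3.0])
centroids = np.array([[0.0, 0.0, 0.0],
                      [1.0, 2.0, 2.0]])

# ||x - c||^2 = ||x||^2 + ||c||^2 - 2 <x, c>
dists = (x @ x) + np.einsum("ij,ij->i", centroids, centroids) - 2 * centroids @ x
print(dists)   # -> [14.  1.]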

 

Konstantinos

It is a good way to lay it out for everyone to see. One last thing I wanted to ask you: You’ve done these use cases. You’ve obviously tried to figure out what the impact will be in time given more power. So, if we stay strictly in the gate-based world, what do you think will be the first application, in your opinion, that might actually be able to earn that advantage? Everyone seems to have a different opinion, so I figured I’d get yours.

 

Yianni

Most people think it’s optimization. Are you asking in the context of finance?

 

Konstantinos

Yes, or any one of the use cases that you’ve tried in the gate-based world. In annealing, I feel that’s optimization, for sure, but in gate-based, what do you think it’s going to be?

 

Yianni

It's definitely going to be something in chemistry simulation. We're convinced of that, and the teams believe strongly in it. We expect that if the hardware stays on its current course, there's a chance we might be able to do something in two to three years: a very specific use case that demonstrates an advantage for that specific use case. It's not going to be a very generic result where you can put any molecule into a quantum computer and always get better or faster results. But for certain molecules — certain properties of molecules — you will get that initial use case, that initial problem instance in chemistry simulation, and we do expect that to happen in the next two to three years.

 

Konstantinos

We will start boning up on chemistry. Yes, that’s terrific. Thank you very much for joining, and I’m excited to see how this evolves.

 

Yianni

Thank you so much. Thanks for having me.

 

Konstantinos

Now it’s time for Coherence, the quantum executive summary where I take a moment to highlight some of the business impacts we discussed today in case things got too nerdy at times. Let’s recap. QC Ware’s team is developing practical quantum use cases based on original research as well as other scholarly papers. This approach lets the company prove the effectiveness of algorithms and applications in the field, and then offer them to customers as ready-to-use building blocks of code. 

These applications become available through a tool called Forge, which has enterprise-class features and should make development easier for those new to quantum. QC Ware seeks to let data scientists do what they are best at without worrying about which quantum computer they’ll be sending their new code to on the back end. One of these cases that generated some press for QC Ware is Monte Carlo simulation. These have been around for a while as a risk tool on classical computers and let companies run simulations of how pricing might change. 

Since around 2015, researchers have believed that quantum computers would one day be able to perform Monte Carlo simulations in roughly the square root of the time taken by a classical system. QC Ware has been able to prove to Goldman Sachs that this will be the case as quantum computers become more powerful. Reusable code for this approach will soon be available in Forge, where it will join other field-tested use cases already in the tool. Despite the press around Monte Carlo simulations, Yianni thinks the first advantage in gate-based quantum computing will come from chemistry simulations. This is a nice full circle back to 1981, when Richard Feynman proposed a quantum computer to simulate the quantum universe we live in.
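For listeners who want the arithmetic behind that square-root claim, the standard scaling argument (stated without any claims about constants or hardware overhead) goes like this:

classical Monte Carlo: error ~ 1 / sqrt(N), so reaching accuracy eps takes N ~ 1 / eps^2 samples
quantum amplitude estimation: error ~ 1 / Q, so reaching accuracy eps takes Q ~ 1 / eps queries

Concretely, cutting the error from 1% to 0.1% takes roughly 100 times more classical samples but only about 10 times more quantum queries. That is the quadratic speed-up discussed in the episode.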

That does it for this episode. Thanks to Yianni Gamvros for joining to discuss Forge and the use cases QC Ware is tackling. Thank you for listening. If you enjoyed the show, please subscribe to Protiviti’s The Post-Quantum World, and leave a review to help others find us. Be sure to follow me on Twitter and Instagram @KonstantHacker. You’ll find links there to what we’re doing in Quantum Computing Services at Protiviti. You can also DM me questions or suggestions for what you’d like to hear on the show. For more information on our quantum services, check out Protiviti.com, or follow Protiviti Tech on Twitter and LinkedIn. Until next time, be kind, and stay quantum curious.
