The cryptographic apocalypse could be under five years away, depending on interconnect technologies that might allow quantum computers to work together. Is now the time to start planning for, or even implementing, post-quantum safe cryptography? It all depends on the shelf life of the data you’re trying to protect. Join host Konstantinos Karagiannis for a chat about post-quantum cryptography you can implement today.
Guest Speaker: Denis Mandich, CTO at Qrypt.
The Post-Quantum World on Apple Podcasts.
Quantum computing capabilities are exploding, causing disruption and opportunities, but many technology and business leaders don’t understand the impact quantum will have on their business. Protiviti is helping organisations get post-quantum ready. In our bi-weekly podcast series, The Post-Quantum World, Protiviti Associate Director and host Konstantinos Karagiannis is joined by quantum computing experts to discuss hot topics in quantum computing, including the business impact, benefits and threats of this exciting new capability.
Could the cryptographic apocalypse be under five years away? Depends on whom you ask. But there are potential interconnect technologies that could speed up the race to 4,000 quality qubits.
Is there something you can do today to protect data with a long shelf life? Find out in this episode of The Post-Quantum World. I’m your host, Konstantinos Karagiannis. I lead quantum computing services at Protiviti, where we’re helping companies prepare for the benefits and threats of this exploding field. I hope you’ll join each episode as we explore the technology and business impacts of this post-quantum era.
Our guest today is involved in post-quantum cryptography, and as you guys know, when it comes to my view of what post-quantum means, it’s a proactive and a defensive kind of thing. We’re looking for quantum use cases all the time, but then there’s this idea of a quantum apocalypse: “Oh, no, all the encryption will break!” — that kind of thing. Our guest today is CTO of a company called Qrypt, and I know him from the Mid-Atlantic Quantum Exchange — we’re on a crypto group together — and I figured it would be great to have him come on and talk about what his company is doing and where post-quantum cryptography in general is going. With that, I’d like to welcome Denis Mandich. Thanks for coming on.
Thanks, Konstantinos, for inviting me. It’s great to have opportunities like this to chat with people who really get it and are looking at what we do about this coming quantum apocalypse. Probably five or six years ago, no one really knew anything about quantum computing, and certainly not about the standards on which our computing and cryptographic resources are based — standards that will be made obsolete very quickly.
Fortunately, the government was thinking about that a long time ago — probably 10 years ago, around when you got into the field. And it wasn’t until 2015 and 2016 that the NSA unilaterally announced that everyone should forget about transitioning to the current generation of crypto we’re using and start thinking about post-quantum crypto. Then this competition began and a new set of algorithms was proposed. Many of those algorithms didn’t make it through the first and second rounds — they weren’t fast enough, or flaws were found in them — and the surviving ones are the ones we’re looking at today as being instrumental to this transition that will start next year.
That’s where Qrypt comes in. Qrypt has looked at this suite of algorithms that are available and built software around the best ones — the best-in-class that we found — and the systems that support those algorithms to make them usable in modern infrastructure and cloud infrastructure. One of the earliest pieces was starting with the premise that we need good-quality random numbers to do all crypto, never mind post-quantum crypto, which has obviously much larger key sizes. As data networks expand, they get bigger and bigger. We have to generate more and more keys. And there just isn’t enough entropy in modern servers and laptops — and certainly handheld devices — to generate high-quality keys.
Historically, this was an issue going back 80 years. The U.S. government made it a very big priority to harvest whatever data it could over airwaves or diplomatic channels and so on, knowing that it could break it one day by finding a flaw in either the implementation of the cryptography or the randomness used to make the keys. That was the Venona project. It was not known until probably the ’80s or ’90s that that’s how the Manhattan Project spies, the Rosenbergs and so on, were discovered — through breaking that crypto. That’s the harvest-now, decrypt-later model.
Today, it’s become so much easier, and when I worked for the government for two decades and saw the scale of IP theft in this country — getting commercialised elsewhere and putting American companies out of business — we looked at the problem: Why didn’t these companies just encrypt their data and make it meaningless that it was stolen because it can’t be decrypted? But one of the things we saw some of the nation-state actors doing was prioritising the theft of encrypted data over the easy-to-get unencrypted data, knowing that it was likely more valuable. But if you did everything right, it would never be decrypted — not by regular computing systems.
But they were always looking ahead, with a 50- or 100-year plan — “This stuff might change the economy of the world one day if we can break it.” And with only the earliest quantum computers coming online back then — D-Wave and other systems — it was unlikely to happen for many years to come. But then Google, IonQ and many other companies have surprised us, so we really need to prepare for that right now. It’s not a question of if; it’s when. The time scale keeps shrinking — from whatever it was 50 years ago, to the 20 years we talked about 10 or 15 years ago, then to 10, and now we’re in that three-to-five range, with IBM coming online with a thousand-qubit machine in the next few years.
If you have four 1,000-qubit machines, all of a sudden you’re knocking on the door of RSA, so there could be some sudden changes coming. You talked about entropy, and I know you’re pushing this idea of entropy-as-a-service, right?
Yes. That’s a problem, because generating random numbers is a very hard thing to do, and the systems we have today are known to be flawed. Research has been published on both the private and public side. Some of the results from just the last few years show the scale of it: one in every 200 or so digital certificates shares a key component — one of the prime factors used in RSA to generate the public and private key pairs. This has long been known — a problem that has been exploited by many groups — so we couldn’t do what we’re doing today with post-quantum crypto software and algorithms without building quantum entropy sources that create massive amounts of random numbers from true quantum processes.
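The shared-prime-factor weakness Denis mentions is worth seeing concretely: if poor entropy causes two RSA moduli to share a prime, an attacker needs only a GCD — no factoring at all — to break both keys. A minimal sketch with toy illustrative primes (real RSA primes are around 1,024 bits):

```python
from math import gcd

# Toy primes standing in for real RSA primes.
p_shared, q1, q2 = 10007, 10009, 10037

# Two public moduli that share the prime p_shared, as happens
# when low-entropy key generation repeats a prime.
n1 = p_shared * q1
n2 = p_shared * q2

# With only the two public moduli, a single GCD recovers the
# shared prime -- far cheaper than factoring either modulus.
recovered = gcd(n1, n2)
assert recovered == p_shared

# Each modulus now factors trivially, exposing both private keys.
assert n1 // recovered == q1 and n2 // recovered == q2
print("shared prime recovered:", recovered)
```

This is exactly why the “one in 200 certificates” result was so damaging: the attack scales to millions of harvested certificates with nothing more exotic than batch GCD.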
You’re a physicist — you know the only source of randomness in the universe is actually quantum measurement. Although we use the term chaos as synonymous with random or disorder in colloquial language, it’s not. Chaotic systems, the kind we harvest entropy from now in computers, are really very well-modeled mathematical systems. We can predict from one second to the next where the system is going to be — maybe not a thousand seconds from now, but that’s not really random. It’s predictable; it’s deterministic.
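That determinism is easy to demonstrate: a pseudorandom generator seeded with the same state replays the exact same “random” stream every time, which is precisely what an attacker who learns the seed exploits. A quick illustration (note that even the OS entropy source here is ultimately harvested from classical physical processes — the point of the discussion is that only quantum measurement escapes this):

```python
import random
import secrets

# Two PRNGs seeded identically produce identical "random" output:
# knowing the seed means knowing every value that will ever come out.
a = random.Random(42)
b = random.Random(42)
assert [a.random() for _ in range(5)] == [b.random() for _ in range(5)]

# An OS entropy source exposes no caller-visible seed; two 32-byte
# draws are overwhelmingly unlikely to collide.
k1 = secrets.token_bytes(32)
k2 = secrets.token_bytes(32)
assert k1 != k2
print("PRNG output is replayable; OS entropy draws differ.")
```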
When you get to that quantum understanding — developing systems that produce random numbers that cannot be predicted — we will never see the same two digital certificates issued, we will never see the same AES keys issued, ever. But we’re not seeing that in systems today. So, we put together entropy-as-a-service with some of our partners around the world. We’ll probably show off another system in the next month or so with a global telecoms company and make that available for all of your cryptosystems — for high-performance computing, for Monte Carlo simulations, for gambling. All these things really require some element of trust that the mechanism you use to make them does not come from one of those deterministic processes.
In a way, I see you guys as defenders of the second law of thermodynamics: You’re guaranteeing that entropy will increase in our universe as it pertains to consumers.
Yes, you absolutely can. I mean, it appeared here first, so why not? Go for it.
Entropy is a poorly understood concept for everyone other than nerds like us in the physics and math community. We’ve seen systems that had been used for more than a decade turn out to be actually very nonrandom and highly biased, meaning that whoever knew about the flaw could exploit it — and this goes back to deliberate attempts at this. In the U.S., in the standards, we had something called Dual_EC_DRBG, which was an alleged attempt by the government to insert a type of kleptographic backdoor, where they know the flaw in the mechanism that creates the random numbers. So, the keyspace isn’t actually 2^256. It’s 2^8. If you know what those 2^8 keys are, you can search them by brute force very quickly, but nobody else can.
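To put those exponents in perspective, some back-of-the-envelope arithmetic shows what a backdoored keyspace means in practice: an honest 2^256 space is untouchable, while an effective 2^8 space falls in nanoseconds. (The trillion-guess-per-second attacker below is an illustrative assumption, not a measured figure.)

```python
full_space = 2**256   # honest keyspace: every key equally likely
backdoored = 2**8     # effective keyspace if the flaw is known: 256 guesses

guesses_per_second = 10**12  # assumed: a generous trillion guesses/second

# Expected brute-force time is roughly half the keyspace over the rate.
seconds_full = (full_space / 2) / guesses_per_second
years_full = seconds_full / (60 * 60 * 24 * 365)

seconds_backdoored = (backdoored / 2) / guesses_per_second

print(f"honest keyspace:     ~{years_full:.2e} years")    # astronomically large
print(f"backdoored keyspace: ~{seconds_backdoored:.1e} s")  # effectively instant
```

The asymmetry is the whole point of a kleptographic backdoor: to everyone without the trapdoor, the system still looks like a full-strength generator.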
So, this eliminates that risk, because when quantum computers come online, they could discover those flaws very quickly. You really need quantum something to counter the quantum threat, and the basic, fundamental, foundational solution is quantum random number generators at scale — the scale of the content-delivery networks you see around the world today, enterprise systems that encrypt massive amounts of data, every TLS exchange on the internet and so on.
We’ve talked about this in our group too — this idea of two ways to approach post-quantum crypto. It’s either physics-based or math-based, to simplify it. You guys, it sounds like, are doing both in some ways, right?
Absolutely. We could build a QKD-like network, but there are easier and faster ways to do it, and honestly, the future is both. We have to have both of these systems in place. We see China now building thousands of kilometers of QKD trusted-node network. That’s not great for doing global communications. If I want to send a QKD key from here across eight hops to someplace in Europe, I would need a satellite or some other trusted node to do it. It’s not really practical for most applications — it is for high-security cases, but not all. For us, it’s about building the systems on top of that: put quantum out there for everyone to use, just use our APIs and plug them in the same way you plug the Google Maps API into your application, and then make it quantum-secure with our algorithms and the libraries we’ve built. There are no secrets inside. Everyone can come kick the tires on how we did it.
We are very much against some of the security by obscurity that we see, especially from some global vendors that claim, “Hey, we’ve got quantum inside here. Trust us — we generate them there.” Well, our argument is that if you don’t disclose everything, no one should believe you, so we disclose everything. Everything we’ve done is in a peer-reviewed public journal article that anyone can read, came out of national labs in the U.S. and Europe. These are published scientists who are not going to have any of these fiascos like we saw at DEFCON last year, where companies were claiming things that are just absurd. We don’t think that’s good for the industry. We don’t think that’s good for anyone. We have to have trust in the system because it’s the foundation of our economies at this point.
So, in trying to roll this out, how would it compare to, let’s say, Quantum Xchange? They have this box where they’re trying to combine everything — they have QKD, they have the NIST finalists in there. How would you say you guys compare?
Yes. When I was back at BT, we did the first experiment where we were able to send QKD over a dirty fiber, first for 20-some kilometers and then a little farther. The limitation was that we were only able to get something like 1.2 megabits per second of one-time pad data, so that really wouldn’t be sufficient to encrypt much on its own. We used it with AES in counter mode to constantly update keys and prevent attacks that way. So, how do you view the future of the one-time pad?
It is the future. It’s the ultimate goal on the horizon. Anything that claims to have that same level of security looks exactly like it, so it might as well be it. Obviously, the problem with the one-time pad is key distribution — well, QKD solves that if you can get the bit rates higher. If I can get up to terabits per second of QKD network infrastructure, now I’m at the end of the line with crypto. I don’t need to do anything else. I have physics inside. All the keys were generated purely randomly between the endpoints. We have key agreement. I can send everything in the clear after that. I can publish it on the internet. Everyone can download it and harvest it as much as they want — it doesn’t make any difference. That is absolutely the end goal where all this is going at some point.
Quantum Xchange and Qubitekk and some other folks are already offering that — again, at lower rates. But if your QKD network is running 24/7, 365, you’re generating that much key material at those endpoints year-round, all the time, making it available as you need it and stockpiling it until it’s used for transmitting a file or a message or whatever it may be. There are other ways to do it too. One of the fallacies about one-time pads is that it’s cumbersome to get them around: previously, you needed to physically deliver the pad from one point to another to get key agreement, just like how QKD works. But today, with cloud infrastructure, there are other ways to do it, and those will be coming online in the next year or two — where I don’t ever have to send a one-time pad from one point to another. I simultaneously generate one-time pads at two different endpoints and get the agreement a different way, and you’ll see more of that coming online.
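The one-time pad itself is almost trivially simple — the entire difficulty lies in generating, distributing and never reusing truly random key material as long as the message, which is exactly what the QKD bit rates above constrain. A minimal sketch:

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    # The information-theoretic proof rests on three conditions:
    # the pad is truly random, at least as long as the message,
    # and never reused.
    assert len(pad) >= len(message)
    return bytes(m ^ k for m, k in zip(message, pad))

# Decryption is the identical XOR with the identical pad.
otp_decrypt = otp_encrypt

msg = b"meet at the usual place"
pad = secrets.token_bytes(len(msg))  # stand-in for quantum-generated key material

ct = otp_encrypt(msg, pad)
assert otp_decrypt(ct, pad) == msg
print("ciphertext reveals nothing without the pad:", ct.hex())
```

Note the ratio this implies: one pad byte consumed per message byte, forever — which is why terabit-class key rates, or pre-stockpiled key material at both endpoints, are the prerequisite for using it at scale.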
If people would try to engage you guys now, what kind of solutions would you give them today? Just your beta?
Yes. The beta, for people who are trying to be crypto-agile and prepare for what’s coming in the future. One of the fears people have is that if I implement one of these post-quantum cryptographic algorithms in my existing applications, it’s network-crushing, it increases latency, it’s going to slow down my applications’ performance — all those things. That’s not true. But you can’t take things out of the box and just plug them directly into your existing applications, because many things won’t work. For example, if you’re trying to do end-to-end encryption and you want to send a thousand messages to your newsgroup, that will fail, because that mathematically has not been solved as a problem: How do I send to a group and encrypt the messages to many people, not just Alice and Bob?
For us, we’re making the beta available to our partners. It’s a service, so you can take our APIs — if you want to try generating quantum keys with your classical encryption system, you can now do that. You don’t have to rely on the weak, poor or nonexistent entropy of virtualised environments in the cloud. We know all those weaknesses — good entropy sources just don’t exist there. Everything is almost always pseudorandom number generation, which is deterministic.
Use entropy-as-a-service — plug it in with our APIs and test it on something, whether it’s a simple lottery or a voting application you might want to use it for. At the next level up, you can incorporate our post-quantum cryptographic libraries, which are optimised for the current generation of processors. They integrate perfectly with our entropy-as-a-service. You have the raw material, which is the random numbers from our cloud service, and you now have our libraries, which incorporate those quantum random numbers into the post-quantum cryptographic algorithms that you can embed like any other library. With the current level of applications, you’ll see that they work. We’ve taken the ones with the strictest parameters — the worst-case-scenario ones — to make sure they work, and everything underneath that will certainly work.
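The integration pattern being described — pull remote quantum entropy, then mix it into local key generation — might look roughly like the sketch below. To be clear, `fetch_quantum_entropy` and its behavior are hypothetical stand-ins, not Qrypt’s actual API; the local substitute just makes the sketch runnable. The mixing step reflects the “trust no one” idea discussed later: the service alone should never be able to reconstruct your key.

```python
import secrets
import hashlib

def fetch_quantum_entropy(n_bytes: int) -> bytes:
    """Hypothetical stand-in for an entropy-as-a-service call.

    A real client would make an authenticated request to the
    provider's endpoint; here we substitute OS randomness so the
    sketch runs self-contained.
    """
    return secrets.token_bytes(n_bytes)

def derive_session_key(label: bytes) -> bytes:
    # Mix the remote entropy with a locally generated secret, so
    # that even a provider who logged every byte it served cannot
    # reconstruct the final key.
    remote = fetch_quantum_entropy(32)
    local = secrets.token_bytes(32)
    return hashlib.sha256(remote + local + label).digest()

key = derive_session_key(b"session-demo")
assert len(key) == 32  # a 256-bit key, e.g. for AES-256
print("derived key:", key.hex())
```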
We’ve shown that it’s absolutely as fast as anything else you can do — again, using our versions of all these libraries. So, we’ve built applications around real-world examples. If you want to do post-quantum secure messaging or a file transfer, we’ve already built that application. It incorporates our entropy-as-a-service and our post-quantum cryptographic libraries, and you can test it the same way you use WhatsApp. But there are absolutely huge differences in the way we’ve done it from what you’d find on the street today. We can talk a little more about that here if you’re interested.
So, in general, just to take a step back for everyone: with the NIST finalists, we’ve been watching this evolution for a while — obviously, both of us have been keeping a close eye on it. In Round one, we had things like BIKE and SIKE, and the performance wasn’t that great — there would be multiple round trips required for a handshake, stuff like that. In Round two, that improved greatly — sometimes the times were cut in half, the latencies were reduced — and we’re moving on now. Are you guys staying closely involved with everything that’s a current finalist? There’s this model where some companies keep whatever the latest finalists are all in the box, so to speak, and the idea is that you could delete them as needed as you go along. Is that the general idea for the libraries?
Not for us. We took the extreme view that some of the weaker ones would fall, as they have — not that we predicted the future very well, but we started this years ago, and we saw some of the early problems with the ones that are no longer in the competition. We thought one of the biggest issues at the time would be the security parameters. Build your libraries and applications around the worst-case-scenario ones — the ones where, if these fail, everything else fails. Then you’re going to be good. You’re going to survive this competition. You’re not going to have to put things into your libraries and your code that slow everything down. You only optimise for the things you want to do best, and put everything else away.
I’m not suggesting anyone else should do what we were doing — meaning, if you want to use SIKE and BIKE and everything else that’s in there today and make that an option, that’s fine. But why not take the strongest one? If that doesn’t work, then ratchet down to something with possibly lower latency, and so on. We’ve proven that this works for big files. It works for the big messaging infrastructure you use every day. We’ve tested it on voice and video. There are performance issues there — those things have a small impact — but over time, that will evolve and go away. The current generation of processors has things like instruction sets for AES. They don’t have that for these post-quantum cryptographic algorithms, which, on the asymmetric side, traditionally take up way more computational resources than the symmetric ones.
Yes, agreed. I’d love to hear more about how you guys handle that in the application. One big question, then, would be everyone’s concerned about crypto agility, of course. Do you guys consider yourselves crypto-agile — and this isn’t really just you guys. It should be everyone. Whatever you settle on, if that does have a weakness, would you be able to, in an agile way, implement one of the alternatives?
Absolutely. Yes, our version of that is performance. We have taken the ones most universally accepted as being the last likely to fall and said that if the performance is good on those, we can make the other ones available if there’s a compliance issue — if, say, HIPAA compliance says I must use x, y and z algorithms, of course we can absolutely do them, and we’re very confident in the performance then. We would not be confident if someone asked us to do what we’re doing today and we had implemented one of the weaker protocols.
Yes, that would make sense.
I would be very worried about the performance. There are bigger issues with all these systems. Let’s say you did everything right with post-quantum crypto — you’ve implemented quantum random number generators and built a key-generation mechanism. There are still other things you have to do to be sure you’re actually secure. Otherwise, it’s just more security theater, which we see all day long in some of these other tools out there. In our application, we changed the way we think about doing that.
For example, if we’re on Teams or WhatsApp or Telegram or whatever, we can have a dozen different sessions of the same messaging channel open at one time on my phone, on my laptop, on my desktop, on my tablet and so on. Well, that’s not really secure, because if you know that there are five different versions of your same discussion between Alice and Bob, then how do you know there aren’t six, seven and eight: one at KGB headquarters, one at the NSA, one that Facebook is data-mining for your advertising profile and so on? We shut that down as another demonstration of how you do this the right way. We’ve made a black tunnel from Alice to Bob. There’s only one version of that communique.
You want to do that same conversation on your laptop? You hand over that channel from your desktop to your laptop, and the other one dies, including all the traffic that was in it before. Now, you know that you have an absolutely trusted link between Alice and Bob that could not have been eavesdropped. It has perfect forward secrecy, it is quantum encrypted, all the things that you want in it, but all the harvesting in the world will not affect you at this point. Whereas, in these other cases, as we know, when duplicate copies of file transfers or messaging or whatever happened, it’s a permanent vulnerability that you can’t get around.
Yes — and so people understand, there is real value in sending information in a post-quantum safe way now. Certain types of information have a long shelf life — medical information, anything of that nature, could be around for a long time. Government secrets, trade secrets. There’s no need to worry about post-quantum crypto so much for a credit card transaction right now, because very few people have the same credit card number for five years, but it is more critical for information that can sit a long time. The recipe for Coca-Cola, for example — you could be sending that with post-quantum crypto. Just understand that there is value right now in using a service like this if you have to send information. How do you feel about the whole idea of hybrid post-quantum? This idea has been around a little while too.
I’m a proponent of that. Anytime you add a layer to your cryptographic security, you’ve improved the situation — assuming it’s not one of these Band-Aid-type solutions. I’ll give you an example: I have multifactor authentication on my phone, but it’s for my email, which is also on my phone. There aren’t really two factors there that do something out of band. I’m not mocking multifactor — I’m just saying that when you add security features on top of security features that are orthogonal, they support each other. You might break one but not the other — that’s great. If it’s more of the same and it doesn’t really improve the situation for you — and that needs to be looked at by experts — then you shouldn’t bother doing it. There’s no point in increasing network traffic and latency and degrading performance on your network if it really doesn’t do anything.
It’s the example I gave with different versions of the same channel existing everywhere. Yes, that channel might be perfectly secure from endpoint to endpoint, but if there’s another full copy of it listening in, you’ve really not done anything. It’s the same thing if I told you, “Hey, use our messaging application for all your communications. I guarantee you a hundred percent that the channel is secure, it’s post-quantum secure, blah, blah, blah,” but I’m going to data-mine everything you type on the client side. I’m seeing everything you do the same way you see it. I’ve not provided you any real security, because I’m doing the same thing — I’ve just protected you from a third party, not from myself.
From what we built, it’s a trust-no-one model. We make all these tools available. We make sure that nobody has a copy of your keys, because we have no idea what you did with the entropy when you got it. We show you a quantum-secure way of combining it to make your own keys that we have no visibility into, even if we saw everything that you did. Other companies are not set up for that. We’re set up for quantum-cryptography-as-a-service so that you do whatever you do with your data, however you want to do it. Trust that we did the security piece right. Everything else — the performance, what you actually do with it, some files, some videos — it doesn’t matter to us. We have no visibility into that. That’s the future going forward. If these companies are really out to provide services to us and not make us the product by data-mining us, then that’s what they’ll do too.
Yes — and because so many people access cloud services through the internet, I think something like hybrid post-quantum will be important while we’re waiting for these ciphers to be finalised: you have, let’s say, regular TLS, but you also have a quantum-safe algorithm, so the handshake produces two shared keys, to simplify it. Then, if one becomes vulnerable to attack, you still have some kind of protection. Something like that will help bridge the gap, because when one thing falls, they both haven’t fallen, necessarily.
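A common way to realise this hybrid idea is to run both key exchanges and feed both shared secrets through a single key-derivation function, so an attacker must break both exchanges to recover the session key. The sketch below uses random placeholders for the two secrets — in a real deployment one would come from classical ECDH and the other from a post-quantum KEM — and a minimal HKDF in the style of RFC 5869:

```python
import hmac
import hashlib
import secrets

def hkdf(secret: bytes, info: bytes, length: int = 32) -> bytes:
    # Minimal HKDF (extract-then-expand, SHA-256, empty salt).
    prk = hmac.new(b"\x00" * 32, secret, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholders: ss_classical would come from ECDH, ss_pq from a
# post-quantum KEM such as one of the NIST candidates.
ss_classical = secrets.token_bytes(32)
ss_pq = secrets.token_bytes(32)

# Concatenating both secrets before derivation means the session key
# stays secret as long as EITHER exchange remains unbroken.
session_key = hkdf(ss_classical + ss_pq, b"hybrid handshake demo")
assert len(session_key) == 32
print("hybrid session key:", session_key.hex())
```

The design choice worth noting is that the combination happens inside the KDF rather than by, say, XORing keys ad hoc: the derived key then depends on every bit of both inputs.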
There’s a case to be made for forcing this on everyone — protecting them from themselves. We still see these things used in the wild: technologies known to be flawed, broken, vulnerable, whatever. They were sunsetted a long time ago, but people who should know better are reluctant to transition to newer, more secure systems unless they’re forced at gunpoint to do so. If you deprecate them in TLS — say my browser, Browser X, only uses these three algorithms in the latest version, everything post-quantum, and three years from now anything pre-quantum will no longer work — then people can transition now, or two years from now, and be ready for that moment. If you put that hard line in the sand for people, they’ll do it.
Yes, we do have to lead the masses to security sometimes, don’t we?
Absolutely. Protect people from themselves. It’s surprising how much people trust cloud companies’ apps that you downloaded from God knows where — the iTunes store or wherever — to not do something nefarious, and time and time again, that happens.
We really need to protect the consumers. We really need to protect our economy from what’s happening today. You know the IP that’s been stolen from the United States at this point? This has never happened in the history of the world. The wealth transfers from one country to another — in physics, we talk about phase transition. There will be a point at which so much of that has already happened and no more needs to be stolen to overtake our economy, and then we become the number two, number three global power at that point. That’s a very real thing that’s happening right now. It’s a little bit of a mystery why a lot of it has not been operationalised and commercialised, but again, they’ll reach a threshold where it won’t matter anymore. So much of it will be gone that we will not be able to catch up.
Absolutely. That’s just an overarching view of the entire QIS field right now. You can’t get into it later. You’ll be really far behind. It would be like trying to get into machine learning now or a few years from now. You’re already so far behind the curve.
I like to compare the two pretty often. And because this is post-quantum crypto we’re talking about — it’s not just quantum computing that might attack cryptography. There are always side-channel attacks, there are always flawed implementations, and of course, there’s what’s making a buzz around the industry right now, if you just want to touch on it for a brief moment — I know we’ve talked about it — this paper by Claus Peter Schnorr and the idea that “Hey, maybe RSA is broken, and it’s not a quantum computer that’s going to do it.” I know you guys are looking at that, so I don’t know if you wanted to share any insights or hints about what we might expect to hear soon.
Sure. We’re looking at Schnorr’s paper very hard, and there’s other work that’s been done on related systems — all the side-channel attacks, everything from looking at what generates random numbers in CPUs today. But the thing NIST has told everyone from the very beginning of the post-quantum crypto competition is that none of these algorithms has any mathematical proof of hardness. We just believe they’re quantum-resistant. They are not quantum-proof. All of them may fall at the end of this competition next year, and we may have to transition to something completely new in 2024 or 2025 or whenever. Until we have something that’s the equivalent of the one-time pad — the only thing that is absolutely mathematically proven to be unbreakable, provided the implementation is done correctly and the key material is truly random — we really don’t know.
These mathematicians are brilliant people. They’re looking hard at this problem from the classical side: I might find a way to factor semiprimes faster, or I might find a way to solve lattice problems like the shortest-vector problem much faster a year from now. That could happen at any time. We could be surprised by this any day. So, we really have to be crypto-agile and ready to transition away from everything we’re doing today, even.
I agree, because one of the great things about this era of quantum machines is how quickly people started developing algorithms. We’re seeing them all the time now, and some of them are proofs of concept to show that you can get a quantum computer to do something, but a lot of them do solve real-world problems. It’s a little terrifying to think there might be a new algorithm coming — something like Shor’s algorithm — that will break one of these solutions. You never know, but it seems like a stretch still. For Bitcoin especially, I wonder if one day we’ll come up with an algorithm that in the interim speeds up the mining process. That would be fascinating, wouldn’t it?
A stopgap. No proof there, but we’ll see.
This has been really fascinating, and if you guys want to learn anything else about Qrypt, you can go to Qrypt.com and check out what Denis and his team are up to. And thanks again for coming on, and I guess I’ll see you at the next crypto meeting.
Yes. We’ll see you this week. We have a lot of work to do there, and I hope we educate the community the way you’re doing today. The knowledge we have is, in some sense, esoteric and specialised, and getting it out to the broader community is just a great public service and good for everyone. There aren’t enough companies doing what we do, so we encourage more people to get involved. Thanks for having us, of course. What you’re doing is great.
That does it for this episode. Thanks to Denis Mandich for joining today to discuss Qrypt and post-quantum cryptography. Thank you for listening. If you enjoyed the show, please subscribe to Protiviti’s The Post-Quantum World and leave a review to help others find us. Be sure to follow me on Twitter and Instagram at @Konstanthacker. You’ll find links there to what we’re doing in quantum computing services at Protiviti. You can also find information on our quantum services at www.protiviti.com, or follow Protiviti Tech on Twitter and LinkedIn. Until next time, be kind, and stay quantum curious.