The fear of a quantum cryptographic apocalypse has been with us since the 90s and has launched quite a few careers and companies. Post-Quantum, no relation to this show, has been developing end-to-end solutions for quantum-secure communications since 2009, and they recently participated in a year-long proof of concept with NATO. Join host Konstantinos Karagiannis for a chat on practical solutions to prepare for the cryptographic apocalypse with Andersen Cheng from Post-Quantum.
Guest Speaker: Andersen Cheng – CEO of Post-Quantum
Quantum computing capabilities are exploding, causing disruption and opportunities, but many technology and business leaders don’t understand the impact quantum will have on their business. Protiviti is helping organisations get post-quantum ready. In our bi-weekly podcast series, The Post-Quantum World, Protiviti Associate Director and host Konstantinos Karagiannis is joined by quantum computing experts to discuss hot topics in quantum computing, including the business impact, benefits and threats of this exciting new capability.
Quantum computing feels like a young field, but some of us have been in the industry for a decade or longer.
Fear of a quantum cryptographic apocalypse has launched a lot of careers and a lot of companies. Post-Quantum — no relation — has been developing end-to-end solutions for quantum-secure communications since 2009, and they recently participated in a yearlong proof of concept with NATO.
Find out more about practical solutions to prep for the cryptographic apocalypse in this episode of The Post-Quantum World. I’m your host, Konstantinos Karagiannis. I lead Quantum Computing Services at Protiviti, where we’re helping companies prepare for the benefits and threats of this exploding field. I hope you’ll join each episode as we explore the technology and business impacts of this post-quantum era.
Our guest today is the CEO of a company called Post-Quantum. I realise that might be confusing, because this is The Post-Quantum World, but it is different from this podcast. I’d like to welcome Andersen Cheng. Thanks for coming on.
We founded the company back in 2009, so, from the startup angle, we’re not new, but if you have to do R&D in post-quantum cryptography — and that’s what we focus on — you do have to spend your time doing it. We spent our early years doing a lot of that, and then we made submissions to NIST, IETF and so on, which have achieved pretty good results. In the meantime, we have developed a suite of products in identity and other blockchain-related areas, but always with a quantum-safe foundation in mind.
I would just rewind to go back to my background, because I trained as a computer auditor more than 30 years ago, so I have always looked at cyber security with end-to-end thinking in mind. Over the years, we have seen a lot of solutions, and they seemed to be pretty much isolated. You can have the best firewall, best intrusion detection, best whatever, but do they talk to each other? Do they link up to each other? I spent a long time trying to think about what would be the perfect ecosystem. Then my cofounder and I were thinking, “Before we can even do that, we have to think about the public key cryptography that we use today — namely, RSA and elliptic curve. They’re not quantum-safe. When a quantum computer comes into existence, then we can just forget about it, because everything would be broken.”
It was on that basis that we started doing a lot of heavy R&D on how to make things work, because public key cryptography is used in everything — whether it’s on this video call that I’m having with you or a wireless chat I have on my iPhone, we use it. Now, if you want to come up with the best solution — I’m sure a lot of your audience is already aware that NIST has been running this competition for the next-generation PQC standards. We submitted our proposal back in 2017. We were one of 82 submissions. Now, hopefully, NIST is going to announce their final standards any day now in April. They have three finalists left in lattice and one left in code-based, and that’s where we’re focused. That’s the background on our thinking behind public key cryptography.
You also mentioned practicality. There is a difference between us and our peers or competitors, because a lot of our peers are academics — they focus on optimising the mathematics — but we are all ex-engineers. We look at the real-life problems to see whether we can come up with solutions for real-time, real-life use, rather than some mathematics which can have all the glory — but whether you can put it into practical use, that’s another matter. That’s the NIST competition.
We also submitted a Hybrid PQ-VPN proposal to IETF which formed the foundation of the next-generation standard in secure connectivity. On that one, it’s interesting, because I can tell you from experience, it’s been a bit of a struggle for us. That’s probably an understatement, because I was always the only person shouting from the rooftop back in 2009 and 2010 and so on, saying, “Quantum is coming. That really will be the end.” People were all laughing at us, saying, “Let’s worry about it post-quantum.”
I coined a number of terms which are now widely used by the industry, because even if the skeptics think it might be 10, 20 years away, how about what we call the “harvest now, decrypt later” attack? There’s now abundant evidence that certain adversaries are diverting internet traffic to certain Eastern European countries or even Russia for two, three hours at a time, and then it’s back to normal. We have to do something about that.
Now, a little bit on the VPN. Obviously, our recent conclusion on that experiment with NATO has attracted some attention. The reason it came about was, after our submission to the IETF, it caught the eye of certain departments, and we did the crypto libraries for that, and then it caught the eye of NATO, because we have known NATO for a number of years now. We have been very friendly. We collaborate on several projects. Last year, they approached us, saying, “This Hybrid PQ-VPN — we have to try something out. Do you have something available?” We differ from our peers because most of them focus on putting algos on chips or checking the same boxes, but we focus on enterprise software solutions.
We thought, “Just imagine, if I go to a CISO or a bank CIO and say, ‘NIST has now come out with this new standard’ — whether it’s lattice- or code-based, it doesn’t matter.” If I’m asking you to throw your RSA away today and adopt a new standard, most likely, no enterprise would allow that to happen. But what if I give you a hybridised solution, belt and braces — I keep the elliptic curve in the tunnel and wrap it with a PQ adaptor — then I can bring in the various NIST candidates. It doesn’t need to be the final standard. It can be a number of those. When we start the handshaking, we can detect: if we’re both still using RSA, we downgrade to our current primitives, but if I have upgraded mine and you have upgraded yours to something else, then we look to see where the common ground is, and we make the connection that way.
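(A rough sketch of the downgrade-or-match negotiation Cheng describes, for readers who want to see it concretely. The suite names and preference order here are illustrative assumptions, not Post-Quantum’s actual protocol: each side advertises what it supports, and the handshake picks the strongest mutual option, falling back to classical primitives only when one peer hasn’t upgraded.)

```python
# Illustrative sketch of hybrid cipher-suite negotiation. Suite names are
# hypothetical; a real implementation would negotiate inside the TLS/IKE
# handshake rather than with bare strings.

# Suites in descending order of preference: hybrid PQC first, classical last.
PREFERENCE = ["ecdhe+classic-mceliece", "ecdhe+kyber", "ecdhe-only", "rsa-only"]

def negotiate(ours: set, theirs: set) -> str:
    """Return the most-preferred suite both peers support."""
    for suite in PREFERENCE:
        if suite in ours and suite in theirs:
            return suite
    raise ValueError("no common cipher suite")

# An upgraded client talking to a legacy server downgrades gracefully...
print(negotiate({"ecdhe+kyber", "ecdhe-only"}, {"ecdhe-only", "rsa-only"}))
# ...while two upgraded peers land on a hybrid suite.
print(negotiate({"ecdhe+kyber", "ecdhe-only"}, {"ecdhe+kyber", "ecdhe-only"}))
```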
It gives people flexibility in the migration, because quantum migration will take many years. It has a different kind of character. It’s not Y2K, because Y2Q is entirely different. Let me expand on that. I was on J.P. Morgan’s Y2K migration committee. Thinking back, it was not too onerous a job, because we had a definite deadline but unknown impact — no one knew what was going to happen. The actual project was laborious but simple, because all you had to do was go through every single module to see whether there was a date field which would reset itself to the 1st of January 1970, and then you just correct it and move on to the next and the next.
Y2Q is the other way around. You don’t know when it’s coming, but when it comes, the impact is going to be 100%, and you cannot just look at all these in isolation, because public key cryptography is all about handshaking. If I’m now trying to connect between A and B, I may have worked something else out which is perfect, but when you now come into the enterprise, now you start handshaking between B and C, C and D, and so on. If you’re not careful at the beginning, you may have an impact or a different set of parameters which will affect your subsequent performance. That’s the future of it.
I’ve been a fan of hybrid approaches for a while because, like you said, you agree on something, and then you send the information. It doesn’t break anything, but in certain circumstances, it can provide enhanced security. If, one day, a quantum computer tries to attack that protocol, hopefully, the wrapper of, in this case, NTS-KEM would provide that extra layer of security, but it hasn’t broken anything. The system still works.
Amazon is doing something not too dissimilar in AWS. They have their hybrid approach, where you can use Kyber, BIKE or SIKE and do the hybrid handshake. Rolling your own cryptography is hard. That’s a security cliché. I’d love it if you could take me behind the scenes of what kind of testing you did to make sure your solution is airtight, and then, also, what NATO might have put you through in this whole yearlong experiment.
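(A note on what “hybrid” means mechanically: the session key is derived from both a classical shared secret and a post-quantum one, so an attacker has to break both exchanges. A minimal sketch, using toy byte strings in place of real ECDH/KEM outputs — this is an assumed concatenate-and-extract construction for illustration, not Post-Quantum’s or AWS’s actual one:)

```python
# Minimal sketch of hybrid key derivation: the session key depends on BOTH a
# classical shared secret (e.g. from ECDH) and a post-quantum one (e.g. from a
# code- or lattice-based KEM). Standard library only; a production system
# would use a vetted KDF such as HKDF.
import hashlib
import hmac

def hybrid_session_key(classical_secret: bytes, pqc_secret: bytes,
                       context: bytes = b"hybrid-handshake-v1") -> bytes:
    # Concatenate the two shared secrets and run an HMAC-based extract step;
    # knowing either secret alone is not enough to recover the key.
    return hmac.new(context, classical_secret + pqc_secret,
                    hashlib.sha256).digest()

# Toy values standing in for the outputs of an ECDH exchange and a PQC KEM.
key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
assert len(key) == 32  # a 256-bit session key
```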
I have to rewind the timeline way before last year, because we were doing our own R&D back in 2013, 2014, at a time when the whole world would not believe us. What we did was say, “We have to put it on something.” We created a secure WhatsApp equivalent, and then we put in our error-correcting code-based approach. At that time, we called it NTS — Never the Same — which has now merged with the submission led by Daniel Bernstein, the top cryptographer in the world. Classic McEliece is now the finalist.
We put it into our secure messaging app — that was back in 2014 — and we were able to prove it would work smoothly, day in, day out. One thing people probably do not realise: they all think it’s as simple as swapping out RSA and swapping in something else. I can tell you, definitely not, because through that experience, we learned quite a bit about what we call secondary characteristics in PQC, where the primitives all behave slightly differently depending on which one you use.
We faced a number of roadblocks back in 2014, and that’s how we built up our expertise in optimising it. At that time, it was just for code-based, and then, through the hybridisation thinking, we were thinking more and more about how we could mix-and-match together. Our focus had been very much on that type of — I wouldn’t call it “cannot fail,” because in a prior venture, we were involved in cannot-fail projects — but for this kind of enterprise solution, we were able to do quite a lot of testing based on different kinds of connectivity, whether it’s wireless, whether it’s online through a link or whatever. We did quite a lot of that.
Then, during the Hybrid PQ-VPN project with NATO, we provided what we thought was the optimised solution, and then we gave them the crypto libraries, and then they were able to test it over a number of locations, and they were mixing-and-matching different types of PQC primitives. A report was written, and the conclusion was that a PQC would be practical for real-world implementations.
We presented the findings at one of their conferences this last November, and did another at their next conference last week. They concluded that the hybridised approach would work and can be interoperable, because NATO is an interesting use case, unlike anyone else, because they’ve got 30 member states. It doesn’t take a rocket scientist to work out that all 30 member states will have different equipment using different operating systems. Some of them may take a long time to upgrade while I may have upgraded mine. We always have to work to the lowest common denominator. NATO did a number of tests on that. Unfortunately, some of those I cannot share with the audience, but they were happy enough with the output, and they have now come up with some next steps as well on further optimisation and on how to put it into a huge trial.
For example, on wireless and satellite comms, in theater and on IoT sensors, because IoT is, I wouldn’t say strange, but it’s something we have to consider seriously as well, depending on whether your sensors have enough processing power and battery power. Sometimes you do not have enough microprocessing capability there, so you have to drop down to the lowest common denominator for that to work. It’s the kind of situation one has to go through.
Were there any attempts at things like side-channel attacks, or were there any darker projects like that that were done?
Yes. They did a number of those — CCA and so on. I’ve got a report. It’s not a classified report. I can ask for permission; if they’re happy, I can share that with you. They did consider a number of those normal crypto stress tests or, as you said, adversarial-type attacks.
I don’t know whether you are on the NIST PQC Forum. It’s quite a vibrant forum. When I say vibrant, it can be a little bit colorful in terms of people — especially, you have all the top cryptographers trying to have a go at each other. Sometimes, when you read their comments, you do learn quite a lot from that, because some of them will specialise in just one area of the attack. Then people will say whether this particular candidate is good enough for that or not. That provides quite a bit of insight into the whole thing.
NIST itself, as an organisation, has not been very active in the forum discussion, because that’s more for the world’s cryptographers and relevant people to critique each other’s submissions. NIST would more or less just observe some of that. From time to time, when certain debates get a little bit too heated, they jump in to mediate.
That’s correct. One thing I would like to highlight: It’s a sacrifice we had to make as well, because certain inventors would come up with proprietary crypto algorithms or even patented ones. We did patent ours as well, but we had to open up. We had to sign away the rights as part of the competition because otherwise, you just will not get the crypto communities to scrutinise it. Then you would think, “Should I just keep everything proprietary and secret and so on?” but if the whole world doesn’t talk, what’s the point? So, we decided to open it up and then to participate.
With the projects you’re working on, there’s also an important thought around the long shelf life of certain data. For an approach like this, you probably wouldn’t even try to convince a company to worry about this for just any old information. Did you do any kind of work with customers to help strategise what the long shelf-life data is — finding out what should go over a tunnel like this, under a wrapper like this?
That’s a very good question. It depends on whom you speak to and the nature of the data — whether you have to keep it for a long time — because a lot of people have been saying, “For government-grade data, you have to keep it for at least 25, 30 years or even more, depending on the country.” You’d have other data which you should start safeguarding even today — in IP-heavy industries like pharmaceuticals, battery research and all these, where they create a lot of the new thinking. We don’t want people to harvest this encrypted data and suck out your stuff now, knowing that they can open it up in a few years’ time.
How about ID and healthcare data and biometrics? We have to keep them safe. Increasingly, in the financial services world, people have also started thinking, “In terms of trading data, it can be in the public domain in a few weeks’ or a few months’ time.” How about your asset-allocation decisions? Take BlackRock, for example — the largest asset manager in the world. I wouldn’t be surprised if most of these sovereign wealth funds are their customers. If, one day, certain countries — in the Middle East, for example — find out that the asset manager is giving different biases, different asset-allocation ideas or different fee structures to their neighbors, I’m sure people will not be very happy. That kind of information, you have to keep secret over a long time.
There are reputational concerns too, I imagine. You just pointed out how this data moves. My favorite example is just to tell people, “If you had a secret formula of Coca-Cola, you probably want to be post-quantum already.” You don’t want to be sending that around.
Yes. That brings up an interesting joke that I tell people, but it’s not a joke: I was invited to a trading-technology conference. After my speaking slot, I went back to my area. There was a CTO of a stock exchange, there was a senior partner of a regulator and then there was a head of procurement at a large investment bank. They were saying, “So, how do you protect it?” I said, “To be honest, even today, I insist on all my bank statements being on paper, because if there’s a quantum attack or a huge hack when everything has gone online, or it takes a long time to recover a backup, at least I’ll be at the front of the queue with some paper statements.” You want to guess what the response was from those three, four guys? They said, “Well, we have a confession to make. We are the same.” So, these guys are keeping stuff on paper as well.
Is the VPN approach the only practical solution you’re looking at right now, or are you trying to implement this hybrid and something else?
Not necessarily, because if you think about it, I have lost count of all the jargon that people have come up with in the last 20 years. There’s always something new every month. If you look at your computing process ruthlessly, it’s nothing more than input, processing and output, with secure transmission in between. If you work on that principle, then you can start thinking about how to create your next-generation, future-proof ecosystem. The VPN will provide you with a secure tunnel for the transmission.
At the end of the day, is that sufficient? The answer is, probably not, because if you’ve got the wrong person who comes in, then you can forget about the rest. This is why your identity is normally your most important thing before you can start going into your enterprise flow. That kind of ID solution, you have to think about how to make them quantum-ready today and to become quantum-safe later. If you can imagine, you can have the most solid pipe in the world, and people will start attacking the joints. When you start protecting joints, people will start contaminating the water going into the pipe. Unless you have this end-to-end thinking, it’s not going to work.
Now, if you’ve got your ID quantum-ready with a secure tunnel going into the enterprise, then you start thinking about your messaging solution, your collaboration tools, your others. Then you can start upgrading them one by one. This quantum migration will not happen overnight. I was talking to a very top cryptographer a few months ago. He thought it would take 10 years to do. This is why a lot of the consulting firms have been regearing and retraining their consultants to become quantum-migration consultants, because there are already frameworks available from ENISA and some of the other standards bodies. You can follow that kind of framework to start doing your inventory audit, and then you can see which is the most suitable PQC for that.
Yes. You could see this approach being used everywhere — even places you’re not looking regularly. For example, web services: whatever information is being sent on the back end over an API is pretty juicy. That XML is usually a pretty juicy target. A lot of critical data is sent that way. Can you quickly have tunnels there and everywhere else in the enterprise? It’s funny: With the remote, or at least hybrid, way we’re all still working right now, if an organisation implemented something like your VPN, it would probably be more post-quantum-secure than any other, because everything would travel quantum-secure from home to somewhere, rather than inside a building where no one even knows what’s happening.
Yes. Let me just throw something into the pot, because a VPN will give you a very secure tunnel, but now the industry trend is to say, “Maybe we should do SASE or ZTNA” and all these. It’s more ID-related. Then, surely, you must be thinking about making your ID module quantum-ready as well, because otherwise, it’s not going to work long-term. So, it’s horses for courses — see whether you can link them together depending on the customer requirement.
One potentially uncomfortable question: What if NIST says, “We don’t want NTS-KEM?” What if they decided at the last minute for some technical reason that we can’t foresee?
To be honest, it doesn’t matter. It doesn’t matter anymore, because we have built our own reputation in the crypto community and in the defense community and so on. I believe I coined the phrase hybridisation back in 2015. NTS, or Classic McEliece, is based on the McEliece cryptosystem. I’m sure you know it’s extremely powerful, extremely secure. It was invented more than 40 years ago, and people have all concluded it’s NP-hard. No one can crack it. At the same time, it does come with a longer key size and so on. It may not be usable in every single use case, but our expertise and know-how from the past few years is no longer built on just one trick. It’s, “If it’s lattice, what kinds of characteristics are there?” Then we know how to use the different lattice candidates, and so on.
In fact, for the NATO VPN test, we did not even put in our own Classic McEliece, because we knew it would be much better, faster and more efficient to use a mix of lattice, multivariate and an assortment of others instead. It’s that kind of thinking that we have. People like the German government, even before the NIST standardisation, have already said Classic McEliece is good enough for their purposes — they have already told their agencies and enterprises, “If you have a relevant use case for it, you don’t have to wait for NIST anymore.” For example, in secure messaging, Classic McEliece is really superior — why would you want to use others? It depends on the use case and the customer requirements.
And then, at that point, you have your team continuing to poke at it and torture-test it over time and be aware of anything that develops with your approach.
That’s correct. Because now, having waited all these years — I’ll be honest with you — there were several occasions when we wondered whether we were way ahead of our time, whether people appreciated it or even wanted it. Should we just forget about it and retire? But we decided to persist, and now everyone is talking about PQC. The market is definitely here. There is funding here. Now, because a lot of people have been putting off doing anything until NIST makes its announcement, I hope the standardisation announcement they’re going to make in April is going to be the beginning of some form of migration.
I believe the first pot of gold will be made by all the consulting firms. They are the ones who will be able to start preaching to their clients and say, “We have to think about it if you’re going to be —” because everyone’s now migrating to the cloud. If people are doing vendor selection today, then why don’t you have a look at quantum-safe or quantum-ready solutions? I’m based in the U.K., but I know Joe Biden has this 30-year infrastructure-renewal plan. I have lost count of whether he’s committed $6 trillion or $7 trillion by now, but I know one of the questions in their RFIs or RFQs is, “What are you doing about quantum?” As you can imagine, if people are going to renew all the highways, all the railways, all the power grids and waterworks, and so on, a quantum attack is very likely in the next 10, 15, 20 years. If they’re doing it now, why don’t they future-proof it?
It’s a very good point. In my company, we help customers figure out how agile they are in cryptography: What needs to go? What is higher-priority? If we were to find some serious flaws and a company wanted to experiment with post-quantum cryptography, would they be able, today, to try to introduce something like your hybrid VPN?
Yes. That’s right. It would be a very good start: people can begin not by fully integrating it into their current infrastructure, but by trying it out and understanding more of the unusual characteristics. Then they know, “If I don’t do it properly, I can create a buffer overflow, so this is why I have to do something else,” and so on. We went through quite a lot of learning back in 2014. Let me give you a bit of background about a prior venture. It was a company called TRL. It’s a company you probably would not have heard of before, but in the defense and intelligence world, it’s well-known, because most people at GCHQ would know it — they would use the solutions.
It used to be top secret, but the acquirer, L3, was very happy to tell the world that that was the only top secret–grade hardware crypto supplier to the British government and their NATO allies. It’s difficult for me to say. You have to start playing with that hardware or software for you to get the feel, because no one understands a single enterprise’s own infrastructure and their parameters and their bottlenecks, and it’s only through that kind of experimentation or inventory audit that you can start getting a feel on what you have to optimise for your own company or agency, because no one in the world can do it for everyone, for everything. We all have to start from somewhere. I believe the VPN is a good starting point.
In parallel, a lot of people are now talking about Web 3.0. Web 3.0 is also a blockchain-based economy. I’m sure you know.
Web 3.0 is built on elliptic curve, which is not PQ. In that community, some people are arguing, “Hashing is PQ.” Of course hashing is PQ, but I’m talking about the other stuff — the signing, and the fact that when you have a quantum machine, it’s not just about cracking it. It’s about whether they can be faster in doing 51% attacks and disturbing your block creation, writing and confirmation. Even if some diehards think the blockchain itself is still PQ, how about the underlying infrastructure? How about the crypto wallet we use today? How about when I have to sign a coin over to you? How about my key recovery and the transmission, all the way to the exchange? In all of that, a quantum machine can start fiddling with and altering your balance in the wallet and at the exchange — even something as simple as reporting. All of that will need a complete rethink.
Web 3.0, the one difference I see is, if people want to go for DeFi, that’s very good, very efficient. But at the same time, if something goes wrong, you don’t have any recourse. You don’t have a regulator or a central bank to say, “There’s a hack at the central bank. You have to give me —” because I know in the U.S., the Federal Reserve has to guarantee a $250,000 compensation if the hack was carried out at the bank level and it’s not your fault. In Web 3.0, no one is going to give your money back. These are also policy considerations.
I was going to ask about this. Are you going to be touching anything in blockchain? Are you going to be trying to come up with some partial solution, at least?
Yes. Let me tell you another joke, because I was invited to a blockchain conference about five years ago. When it was my turn to speak, I said, “Blockchain has no security. Blockchain only has immutability to prove what has been recorded there.” I was almost thrown out of the conference in front of 250 attendees because the organisers said, “Hang on — I invited you to talk about how good blockchain is, rather than problems.” I said, “Yes, but immutability is different from security.” Now, in the past few years, a lot of people have announced, “We need to put security layers on top” and so on. We can address that in several directions.
In the community, people are now starting to think, for all the layer 1 protocols today, none of them are PQ, so there is a need to have a quantum-safe layer 1 protocol. Now, all the hacks that we’re seeing today, they are not necessarily attacks on the layer 1. Other people on layer 2 are trying to build bridges to be interoperable. That’s when you start having problems. That is an area that we have to think about as well. I have also coined another phrase for this kind of work. I’m not calling it DeFi, because I do think we are probably still two, three years away from being able to have good DeFi solutions. I call what we have today HyFi. It’s not a music system, but it’s hybridised finance.
If you imagine, a lot of the criticisms in the Web 3.0 world are that Web 2.0 is bloated, very bureaucratic, very slow-moving, blah-blah-blah, which is true. But at the same time, we have been very much a Web 2.0 company in banking, in defense and whatever. We have spent decades perfecting certain solutions, whether you call it ID or a risk management platform or whatever. Why don’t we be objective? Ignore the bureaucracy, but grab the best modules and product solutions from the Web 2.0 world and use them for Web 3.0. Web 3.0’s advantage is that people can experiment and go live very quickly, in no time. So, why do we have to reinvent the wheel? Why don’t we just grab the best and try to translate that for Web 3.0?
We are in fact doing some of those translations right now to see what we can do in order to plug some of those gaps, because people used to argue with me and dismiss my thinking, but I don’t think they’re arguing anymore, because every week, you see a new hack. People have become numb, whether you’re losing $100 million or $625 million or whatever. Something needs to be done about it, because right now, a lot of the remedies there are compensations from the providers. One day, those providers will run out of money to compensate the users who have been hacked. I believe you do need to have a new way of thinking about it, and HyFi is the way forward.
Yes. Probably, we’re going to want to have you back on in the future to talk just about that as you come up with some practical approach, because right now, blockchain needs a lot of help.
There are very few blockchains that have any kind of post-quantum cryptography. We had one on, the Quantum Resistant Ledger. We’re going to have them on again with this other lab that’s working on the solution, so I do want to talk more about it in the future.
Yes. Sometimes, it’s not just through the sheer quantum-cracking capability. Sometimes, it’s got to do with the protocol itself, because even if the public key is revealed for just a split second, if a quantum machine is faster than what you are capable of processing, then they already have that tiny microsecond advantage, and they can do something about it. Some of the protocols will need quite a bit of rethinking, but I know a lot of the hacks today are happening in the cross-chain type of connectivity. I would have to say, the hackers today are clever. I admire their ability. It’s a difficult battle, but that’s where a lot of the issues will arise. We need to think about whether we can also come up with PQ solutions for that.
Agreed. Thank you so much for joining and sharing your breadth of knowledge. Like I said, I’m definitely going to want to have you back when you’re delving more into the blockchain world.
Yes. We’re working on a module which can help the Web 3.0 world as well as the Web 2.0 world. I believe that will reside very healthily and very correctly in the HyFi — i.e., in between — that you can do both, because there’s a lot of new stuff that we can learn from the DeFi world. Then, we also know that there are some very good modules. If we ignore the bureaucracy management and so on, some of those Web 2.0 modules you can apply immediately.
That will be our next topic, for sure, in the future. Thank you so much.
Thank you for having me.
Now, it’s time for Coherence, the quantum executive summary, where I take a moment to highlight some of the business impacts we discussed today in case things got too nerdy at times. Let’s recap.
Post-Quantum has been thinking about end-to-end post-quantum cryptography solutions since 2009. They’ve been taking the coming crypto apocalypse very seriously. In 2017, they submitted the NTS-KEM cipher to NIST’s hunt for post-quantum cryptography candidates. It has since merged into Classic McEliece and is a third-round finalist.
Post-Quantum’s VPN solution wraps traditionally encrypted data like elliptic curve in a PQC wrapper. This is a hybrid approach. If traditional encryption falls to a quantum computer of sufficient power, the PQC layer hopefully won’t. They have a system that is similar to browser handshakes in that it can determine which ciphers are present and usable by both parties.
NTS-KEM was first used in a secure messaging app in 2014. The company has learned a lot in developing and testing it since then. In 2021, they began to work on the NATO proof of concept for the National Cyber Security Center, or NCSC. This POC was designed to test communication flows that could maintain security against the quantum threat. Remember Y2K? This is the fight against Y2Q — in many ways a more pervasive problem. The NCSC POC involved stress-testing the hybrid Post-Quantum VPN for almost a year, using a different hybrid mix than the original NTS-KEM one.
Solutions like this VPN are already important, because certain information has a long shelf life. Credit card numbers are not eternal, but health information is, for example. Post-Quantum is also concerned with quantum-ready identity management, in keeping with their true end-to-end approach.
The threat to blockchain has been discussed on this show before. Post-Quantum has already given thought to securing DeFi and Web 3.0. Their proposed approach may involve something called HyFi — hybridised finance. Can some traditional modules from the Web 2.0 world aid Web 3.0 security? It will be interesting to see what the team comes up with.
That does it for this episode. Thanks to Andersen Cheng for joining to discuss PQC and Post-Quantum. Thank you for listening. If you enjoyed the show, please subscribe to Protiviti’s The Post-Quantum World, and leave a review to help others find us. Be sure to follow me on Twitter and Instagram @KonstantHacker. You’ll find links there to what we’re doing in Quantum Computing Services at Protiviti. You can also DM me questions or suggestions for what you’d like to hear on the show. For more information on our quantum services, check out Protiviti.com, or follow ProtivitiTech on Twitter and LinkedIn.
Until next time, be kind, and stay quantum curious.