I would just rewind to my background: I trained as a computer auditor more than 30 years ago, so I have always looked at cyber security with end-to-end thinking in mind. Over the years we have seen a lot of solutions, and they have seemed pretty much isolated. You can have the best firewall, the best intrusion detection, the best whatever, but do they talk to each other? Do they link up with each other? I spent a long time thinking about what the perfect ecosystem would be. Then my cofounder and I were thinking, “Before we can even do that, we have to think about the public key cryptography that we use today — namely RSA and elliptic curve. They’re not quantum-safe. When a quantum computer comes into existence, we can just forget about it, because everything would be broken.”
It was on that basis that we started doing a lot of heavy R&D on how to make things work, because public key cryptography is used in everything, whether it’s the video call I’m having with you or a chat on my iPhone. Now, if you want to come up with the best solution, I’m sure a lot of your audience is already aware that NIST has been running a competition for the next-generation PQC standards. We submitted our proposal back in 2017; we were one of 82 submissions. Hopefully, NIST is going to announce their final standards any day now in April. They have three finalists left in lattice and one left in code-based, and that’s what we’re focused on. That’s the background on our thinking behind public key cryptography.
You also mentioned practicality. There is a difference between us and our peers or competitors, because a lot of our peers are academics — they focus on optimising the mathematics — but we are all ex-engineers. We look at real-life problems to see whether we can come up with solutions for real-time, real-life use, rather than mathematics that gets all the glory; whether you can put it to practical use is another matter. That’s the NIST competition.
We also submitted a Hybrid PQ-VPN proposal to the IETF, which formed the foundation of the next-generation standard in secure connectivity. On that one, it’s interesting, because I can tell you from experience it’s been a bit of a struggle for us. That’s probably an understatement, because I was always the only person shouting from the rooftops back in 2009 and 2010 and so on, saying, “Quantum is coming. That really will be the end.” People were all laughing at us, saying, “Let’s worry about it post-quantum.”
I coined a number of terms which are now widely used by the industry. Even for the skeptics who think quantum might be 10 or 20 years away, how about what we call the “harvest now, decrypt later” attack? There is now abundant evidence that certain adversaries are diverting internet traffic to certain Eastern European countries, or even Russia, for two or three hours at a time, and then things go back to normal. We have to do something about that.
Now, a little bit on the VPN. Obviously, our recently concluded experiment with NATO has attracted some attention. The way it came about was that, after our submission to the IETF, it caught the eye of certain departments, and we did the crypto libraries for that. Then it caught the eye of NATO, because we have known NATO for a number of years now; we have been very friendly and have collaborated on several projects. Last year, they approached us, saying, “This Hybrid PQ-VPN — we have to try something out. Do you have something available?” We differ from our peers because most of them focus on putting algorithms on chips or checking the same boxes, while we focus on enterprise software solutions.
We thought, “Just imagine going to a CISO or a bank CIO and saying, ‘NIST has now come out with this new standard,’ whether it’s lattice- or code-based.” If I ask you to throw your RSA away today and adopt a new standard, most likely no enterprise would allow that to happen. But what if I give you a hybridized solution, belt and braces: I keep the elliptic curve in the tunnel, I wrap it with a PQ adaptor, and then I can bring in the various NIST candidates. It doesn’t need to be the final standard; it can be a number of them. When we start handshaking, we can detect what each side supports. If we’re both still using RSA, we fall back to our current primitives; but if I have upgraded mine and you have upgraded yours to something else, then we look for common ground, and we make the connection that way.
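The detect-and-fall-back handshake described here can be sketched in a few lines. This is a hypothetical illustration only: the suite names, the `negotiate` function, and the hybrid/classical split are my own assumptions, not the actual product or protocol API.

```python
# Hypothetical sketch of hybrid handshake negotiation: prefer a
# classical + post-quantum pairing, fall back to classical when
# either peer has not yet upgraded. All names are illustrative.

CLASSICAL = {"rsa-2048", "ecdhe-p256"}            # current primitives
POST_QUANTUM = {"kyber-768", "classic-mceliece"}  # example NIST candidates

def negotiate(mine, theirs):
    """Pick a key-exchange choice both peers support."""
    common = set(mine) & set(theirs)
    classical_common = common & CLASSICAL
    pq_common = common & POST_QUANTUM
    if not classical_common:
        raise ValueError("no common classical primitive")
    classical = sorted(classical_common)[0]
    if pq_common:
        # Belt and braces: classical tunnel wrapped with a PQ adaptor.
        return ("hybrid", classical, sorted(pq_common)[0])
    # Peer still on RSA/ECC only: downgrade to current primitives.
    return ("classical", classical, None)

# Both peers upgraded -> hybrid connection.
print(negotiate(["ecdhe-p256", "kyber-768"], ["ecdhe-p256", "kyber-768"]))
# One peer still legacy -> classical fallback.
print(negotiate(["ecdhe-p256", "kyber-768"], ["ecdhe-p256"]))
```

The point of the sketch is the shape of the decision, not the algorithms themselves: the hybrid path never replaces the classical primitive, it only adds a PQ layer on top, so a half-migrated fleet can keep interoperating.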
It gives people flexibility in the migration, because quantum migration will take many years. It has a different character: it’s not Y2K, because Y2Q is entirely different. Let me expand on that. I was on J.P. Morgan’s Y2K migration committee. Thinking back, it was not too onerous a job: we had a definite deadline, but unknown impact — no one knew what was going to happen. The actual project was laborious but simple, because all you had to do was go through every single module to see whether there was a date field which would reset itself to the 1st of January 1970, then correct it and move on to the next and the next.
Y2Q is the other way around: you don’t know when it’s coming, but when it comes, the impact is going to be 100%. And you cannot look at all of this in isolation, because public key cryptography is all about handshaking. If I’m connecting A and B, I may have worked something out which is perfect, but within an enterprise you then start handshaking between B and C, C and D, and so on. If you’re not careful at the beginning, you may end up with a set of parameters which affects all your subsequent connections. That’s the future of it.
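The B-C, C-D point can be made concrete with a small sketch. This is my own illustration (node lists and function names are assumptions): one node that has not upgraded constrains what the whole chain can agree on end to end, even when individual hops could have negotiated something stronger.

```python
# Hypothetical sketch: how one legacy node in a chain A-B-C limits
# the parameters available end to end. Suite names are illustrative.
from functools import reduce

def chain_common(supported):
    """Suites every node in the chain supports."""
    return reduce(lambda a, b: a & b, map(set, supported))

def pairwise_choices(supported):
    """Per-hop negotiation: the suites each adjacent pair shares."""
    return [sorted(set(a) & set(b)) for a, b in zip(supported, supported[1:])]

nodes = [
    {"kyber-768", "ecdhe-p256"},  # A: upgraded
    {"kyber-768", "ecdhe-p256"},  # B: upgraded
    {"ecdhe-p256"},               # C: legacy
]
print(chain_common(nodes))       # only the classical suite survives end to end
print(pairwise_choices(nodes))   # A-B could go hybrid, B-C cannot
```

The A-B hop could negotiate a post-quantum suite in isolation, but if the parameters chosen there are assumed downstream, the B-C hop breaks — which is the speaker’s point about planning the whole enterprise before fixing parameters at the first handshake.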