# Transcript | Coding High Performance, Post-Quantum Secure Blockchain Systems

In coding, libraries make developers’ lives easier, sparing them thousands of lines of code, while also increasing performance and security. And everyone’s concerned with security as fault-tolerant quantum computing approaches. Geometry Labs has released the “lattice-algebra” library to bring a high-performance cryptographic library to developers interested in using post-quantum cryptography in blockchain and other applications. Join host Konstantinos Karagiannis for a chat on this approach with Mitchell Krawiec-Thayer and Brandon Goodell from Geometry Labs and Michael Strike from The QRL.

**Guests:**

Mitchell Krawiec-Thayer & Brandon Goodell — Geometry Labs

Michael Strike — The QRL

**Konstantinos**

In coding, libraries make developers’ lives easier, sparing them thousands of lines of code while also increasing performance and security. Geometry Labs has released a lattice-algebra library to bring a high-performance cryptographic library to developers interested in using post-quantum cryptography in blockchain and other applications. Find out more about the future applications of this library in this episode of the Post-Quantum World. I’m your host, Konstantinos Karagiannis. I lead quantum computing services at Protiviti, where we’re helping companies prepare for the benefits and threats of this exploding field. I hope you’ll join each episode as we explore the technology and business impacts of this post-quantum era.

So, today is a bit of a first for Post-Quantum World. We’re having three guests on at the same time, so we’ll do the best we can to untangle this for the listener. First of all, we’re welcoming back our very first guest ever: Michael Strike, Director of Outreach at the QRL. So, Michael, if you want to say hello for a second.

**Michael**

Hello for a second. Is this working?

**Konstantinos**

There you go.

**Michael**

All right.

**Konstantinos**

So, now you know his voice, and we have Mitchell Krawiec-Thayer. He is the President and Chief Scientist at Geometry Labs. Hey, Mitchell.

**Mitchell**

Nice to meet you.

**Konstantinos**

We have Brandon Goodell, Senior Cryptographer and Blockchain Architect at Geometry Labs.

**Brandon**

Howdy everybody.

**Konstantinos**

All right. So, that’s our team, and we’re all here to discuss quite an intense topic, so we’ll move into this slowly. Basically, we’re going to be looking at what we can do to expand the reach of post-quantum cryptography and how it can be implemented in other solutions. If you recall, we had Michael on to talk about the QRL, so I’m going to let him give a little recap of what the QRL is for those of you who might not have heard the first episode.

**Michael**

So, the Quantum Resistant Ledger is a native blockchain that was purpose-built from the ground up to be quantum-secure. As many of the viewers know, the blockchain’s immutability is one of its greatest features, but it can also be looked at as a bit of a crutch, in that old addresses and schemes never really go away. So, we’re purpose-built. We’re built on XMSS, a hash-based signature scheme recommended by NIST, so our signature scheme uses cryptographic hash functions instead of elliptic curve cryptography. We’ve been running for, I think, about three years with 100% uptime. So, if you have any questions, you’re welcome to reach out to me at [email protected] or visit us at www.theqrl.org. Thanks, Konstantinos.

**Konstantinos**

As you mentioned ECC - that’s susceptible to Shor’s algorithm, and this introduces a whole world of quantum cryptographic attacks. The estimate on how many qubits you need to start attacking blockchains isn’t really that large: about 2,500 logical qubits. We still have a ways to go, and we have to figure out what logical qubits really mean and how many physical qubits you need to get there. There are some new technologies coming where qubits are getting better and better, so it might be that you only need 2,500 qubits, period. So, we’ll see how that works out in the coming years.

**Michael**

Or 20 million noisy. [Laughter]

**Konstantinos**

Or 20 million noisy, exactly. There are some exciting announcements coming, things you’ll probably hear on this show too. So, we’re not sure yet, but that could be only a handful of years away, so we have to start preparing. Now, recently, the QRL announced a partnership with Geometry Labs, and with it the release of something called the lattice-algebra cryptographic library on GitHub. I think maybe we could start with Mitchell, if you want to give a high-level view of what that is.

**Mitchell**

Yes, sure. So, we've been very focused on making sure that practical post-quantum crypto is ready before quantum computers are ready to practically break current crypto. What we've been working on for the last couple of months is releasing some free and open-source software that provides the kinds of mathematical primitives you need to start building a variety of solutions for that, and so it's a - actually, can I turn it over to Brandon for the kinds of solutions?

**Konstantinos**

Sure, absolutely. Yes, you guys could feel free to hand back and forth. I know you work together closely.

**Brandon**

Sure. Like Konstantinos mentioned, my name is Brandon Goodell. I have a PhD in mathematics, and now I blow my time on cryptography. So, the interesting thing about the lattice-algebra library is that it's sort of a data representation issue. Who said it a minute ago - everybody in their life needs data processing?

**Konstantinos**

That was Michael.

**Brandon**

Yes, that was Michael. Just right before we started, he said something like that. Here's what is really interesting about lattice-algebra - at least one of the interesting things for me. Lattice-based cryptography all boils down to a bunch of linear algebra, and when you hear the term "linear algebra," if you have any experience from university studies, or even just the internet over the last couple of years, there are so many machine-learning possibilities out there, and data representation and data mining opportunities out there. One of the first things you do in a linear algebra class is learn how to represent data optimally so that you can perform computations on it very efficiently and not waste a lot of computation time. If you think about it, multiplying two polynomials together, FOILing it out, takes forever, because you have to do all of the different cross terms. If you can represent those polynomials in a nice, optimal way, then you can do polynomial multiplication very quickly, and this is called the number theoretic transform. So, the lattice-algebra library handles these fundamental mathematical objects, like polynomials, in a way that is safe, because the computations on them run in constant time, but it's also actually implementing this strange world of linear algebra with polynomials. There's actually a weird analogue in real life, and in fact in biology, to how these computations are handled. This is something you learn in a neuroscience class: when your ear hears information from the surrounding environment, it takes the sound wave in, and your nervous system, before that information even hits your brain, does a Fourier decomposition, which essentially describes the sound wave as frequencies and amplitudes instead of amplitude over time.

Instead of representing things in a time domain, things evolving over time, you represent things in a frequency domain. So, you don't necessarily look at how long the sound spike lasts; you look at what frequencies were present in that sound file, and this is really the optimal way of representing the data so that you can do a lot of sound processing on it. In fact, the human eye does a similar thing: you do a sort of wavelength decomposition on the information that comes from your eye before it even hits your brain, and it represents that information optimally so that you can do computations on it very efficiently. That's basically what our library does. It takes these keys, these polynomials that are hard to deal with and are not represented optimally, represents them optimally, does all your signature algorithms and everything, and then pulls everything back to the polynomial world. When it does that, it's like going through a mathematical time warp, or a wormhole where you pop in one area and pop out in another. It's kind of magic. So, the library is basically designed to handle all this mathematical cryptography stuff as optimally as possible, in a safe way.
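The frequency-domain trick described above can be sketched in a few lines of Python. This is a toy illustration, not the lattice-algebra library's implementation: the parameters n = 4, q = 17, and the root of unity psi = 2 are illustrative choices for the ring Z_q[x]/(x^n + 1), the kind of ring lattice schemes work in. Schoolbook multiplication does O(n^2) cross terms; in the NTT domain, multiplication is pointwise.

```python
# Toy number theoretic transform (NTT) in Z_q[x]/(x^n + 1).
# Illustrative parameters: psi = 2 is a primitive 2n-th root of unity mod 17.
n, q, psi = 4, 17, 2

def naive_mul(a, b):
    """Schoolbook multiplication mod x^n + 1: all O(n^2) cross terms."""
    c = [0] * n
    for i in range(n):
        for j in range(n):
            if i + j < n:
                c[i + j] = (c[i + j] + a[i] * b[j]) % q
            else:  # x^n = -1, so high terms wrap around with a sign flip
                c[i + j - n] = (c[i + j - n] - a[i] * b[j]) % q
    return c

def ntt(a):
    """Evaluate a at the roots of x^n + 1, i.e. at the odd powers psi^(2j+1)."""
    return [sum(a[i] * pow(psi, (2 * j + 1) * i, q) for i in range(n)) % q
            for j in range(n)]

def intt(A):
    """Interpolate back from evaluations to coefficients."""
    n_inv = pow(n, -1, q)
    return [n_inv * sum(A[j] * pow(psi, -(2 * j + 1) * i, q) for j in range(n)) % q
            for i in range(n)]

def ntt_mul(a, b):
    # In the "frequency domain," multiplication is pointwise: O(n) products.
    return intt([x * y % q for x, y in zip(ntt(a), ntt(b))])

assert ntt_mul([1, 2, 3, 4], [5, 6, 7, 8]) == naive_mul([1, 2, 3, 4], [5, 6, 7, 8])
```

Real implementations use a fast butterfly-style NTT rather than this quadratic evaluation, but the picture is the same: transform, multiply pointwise, transform back.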

**Konstantinos**

You wanted to become…

**Mitchell**

To give it…

**Konstantinos**

I was going to say you wanted it to become a new kind of hidden layer that people don’t have to worry about.

**Brandon**

Yes, sort of. Right now, if you’re programming, you’ve got to use integers or you’ve got to use floating-point numbers, right? These are fundamental types inside the computer, and basically, what we’ve done is we’ve taken these cryptographic objects, polynomials and vectors of polynomials, and turned them into a type in Python so that you can just add them together and multiply them together as naturally as you would integers in Python. That makes programming cryptographic protocols a snap - like 100 lines, 20 lines, something like that, for a good signature algorithm - and all you’re doing is putting the security and development problems on the underlying libraries, like the SHA-256 extendable-output function. That way, you don’t have to worry about the implementation risk of some developer coming in, trying to program up some cryptographic protocol, screwing something up, and as a consequence producing something that’s insecure. We want to eliminate that as much as possible.
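As a sketch of what that kind of operator overloading could look like (a hypothetical illustration, not the actual lattice-algebra API, with made-up toy parameters), a polynomial ring element can be made to behave like a built-in numeric type, so that a Schnorr-style response x + c·y is written exactly as it appears in a theory paper:

```python
# Minimal sketch of a ring-element type for Z_q[x]/(x^n + 1).
# Hypothetical illustration only; the real lattice-algebra API differs.
Q, N = 17, 4  # illustrative toy parameters

class RingPoly:
    def __init__(self, coefs):
        self.coefs = [c % Q for c in coefs]  # store coefficients reduced mod q

    def __add__(self, other):
        return RingPoly([a + b for a, b in zip(self.coefs, other.coefs)])

    def __mul__(self, other):
        # Schoolbook negacyclic multiplication; a real library would use the NTT.
        c = [0] * N
        for i, a in enumerate(self.coefs):
            for j, b in enumerate(other.coefs):
                if i + j < N:
                    c[i + j] += a * b
                else:
                    c[i + j - N] -= a * b  # x^N = -1 in this ring
        return RingPoly(c)

# Protocol code now mirrors the math on the page:
x = RingPoly([1, 2, 3, 4])   # secret key part
y = RingPoly([4, 3, 2, 1])   # secret key part
c = RingPoly([2, 0, 0, 0])   # challenge, as a constant polynomial
sig = x + c * y              # reads exactly like the theory paper
```

Python’s normal operator precedence evaluates `c * y` before the addition, so the line of code and the formula agree without any extra ceremony.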

**Konstantinos**

Anything resembling homebrew is always a bad idea. [Laughter] Mitchell, you were going to say something there?

**Mitchell**

Oh, I was going to comment. [Pause] I was going to follow up the comment about…

**Konstantinos**

Okay. We could just touch back. That’s fine.

**Mitchell**

Sorry, I’m just going to cut this a little bit. What I was going to give was a concrete example: say we have two polynomials we want to add. With all the stuff being under the hood, the top-level programmer can just have the theory paper open on their left monitor and their IDE open on the right monitor, and literally just grab the theory stuff out. We’ve actually done some side-by-side comparisons of the Python code and the theory papers, and they’re very similar. So, in addition to taking that coding complexity out, and all the complications that come with coding it up, we now make it more accessible for cryptographers to interact with these things, and not have to go deep into the weeds coding up the critical mathematics first. So, this is a massive productivity tool for people developing cryptography applications: they can just fast-forward half or two-thirds of the way through the project, have all those primitives at their disposal, and then just build out the high-level piece that they want.

**Brandon**

There’s this really common thing in cryptography where everybody tends to program up a library that does exactly the protocol they were trying to code up, and then they go to another protocol and have to recreate an enormous amount of their infrastructure, because they’re coding up each protocol on its own. There is a reason to code up each protocol on its own rather than having a common library underneath them, which is that you might be centralizing a development risk or some insecurity in the code: you’re centralizing it in a single location that’s being referenced by multiple projects, and that’s very problematic in the development world. However, when you have to recreate your infrastructure over and over again, what you end up with is people programming the number theoretic transform uniquely for each instance of each cryptographic protocol they’re working with, and it’s a slow process. By typing everything out this way and turning things into types, like integers, the idea is to make a productivity tool where you can just take the math, put it in the Python, and hit go. That’s very…

**Michael**

So, without being a programmer myself, basically, and for everyone else watching who might not be an algebra expert or a programmer: essentially, what we’re talking about is a modular approach to building out provably quantum-secure code without having to be an expert in each of the respective fields that would otherwise be required. That’s what I heard, and that’s the way I understand the library to be. Konstantinos, if you don’t mind, I’ve got a question or two.

**Konstantinos**

Yes, and I was going to ask you, too, to explain the biggest thing that you’re hoping this brings to the QRL.

**Michael**

I can do that. Really, the goal here with the partnership is - as part of our educational research, we’re trying to take novel approaches to solving problems that people have thought about but maybe don’t have solutions for, and I think being able to provide essentially what some might consider a software development kit, or a modular architecture for building your own applications, is really powerful, because to me, it’s a bit like a platform, like an operating system or a smartphone. You don’t have to know how an OLED display works. You don’t have to know how all the hardware works. You don’t have to be an RF expert. You don’t have to build the phone. You already have the modules there. You can just call them, and they do one thing. From my understanding, they do one thing and they do that one thing really, really well, right, and deterministically. So, what we’re trying to do is essentially educate and empower everyone else to be able to build their own quantum-secure applications and services. Some of those in the future, based on the life cycle of how I myself and others view QRL projects, will be on-chain on the native QRL blockchain. There are not a lot of players in this space, so we’re in some ways leading some areas of it. The space will grow, and I see our, for lack of a better word, knowledge market share growing as well as we put more and more time into this. I was going to ask Mitchell and Brandon: we’ve been talking about these primitives in algebra, and I think one thing I didn’t hear said is that this enables a lot of features that the QRL project will be able to take advantage of, features that will handle some of the problems moving forward.

For example, take scalability with signature aggregation: I know you guys have built out lattice-based proof-of-stake signatures, plus the aggregation that goes on top of that, so that the information can actually be brought to scale if you’ve got 10,000 validators. One of my questions, and I’ll let either one of you talk about this, is: what new capabilities - lattice-based signatures, proof-of-stake consensus, Lightning Network payments - what functions are those going to be able to bring in a post-quantum-secure world for people who might want to implement them? The other question I personally had is that the signature aggregation piece is interesting to me. I wonder if someone could explain what the costs are in aggregating all the signatures together. I’m assuming that’s just a CPU cost to do it upfront, and then on the backend you save space natively on the chain. But take it away - either one.

**Brandon**

Oh, I got it. I can do this one, and Mitchell, feel free to interrupt me at any moment. So, one of the cool things that we can do with lattice-algebra - by the way, lattice-algebra is the name of the repo, the library that we put out, right? So, throughout this conversation, it’s possible that mathematically inclined people are going to be like, “Lattice algebra - what do you mean by that?” I mean the software package. One of the cool things that we can do with the lattice-algebra library is implement signature schemes very quickly and easily, as long as they don’t have key distributions that are a little wonky, and that is a technical term. [Laughter]

**Konstantinos**

I use it all the time. [Laughter]

**Brandon**

Right. It’s a very useful technical term. So, one of the cool things that you can do is aggregate signatures. A Schnorr signature is basically a linear combination of two keys. You have a key x and a key y. You hash your message to get a challenge c, and you compute x + c·y, and that’s basically the Schnorr signature. To verify it, you then use that as a secret key and put it through whatever secret-key-to-public-key map you have access to. So, in the discrete logarithm world, you exponentiate: you go g to the x + c·y. In the lattice world, on the other hand, you take a dot product - it’s like linear algebra from college, where you just take a dot product between x + c·y and these public challenge elements. There is a whitepaper by Dan Boneh and Sam Kim from a couple of years ago - and I shouldn’t say published, because it’s a whitepaper. It’s not peer-reviewed. It’s just been on the internet for a couple of years as a preprint. It has an interesting non-interactive aggregate signature scheme in it that’s Schnorr-like, from the discrete log world, and it has the same sort of algebraic structure. In fact, if you look at things from the correct perspective, Schnorr and this Boneh-Kim scheme are just two instantiations of the same scheme. They go ahead and prove in this paper that those signatures can be aggregated, and for years, Schnorr signature aggregation has sort of been an urban legend. Everybody says it can be done, but nobody had really sat down, proven it, and written up a security model, and they did that in this paper. That’s the discrete log world, but then they carried it over to the lattice-based world, and they have this kind of interesting aggregate signature scheme.

When we were reading through it, I noticed that the parameter selection section seemed to be thrown in there last minute - and I don’t want to criticize Boneh and Kim at all, because it’s not been peer-reviewed, right? It’s not been published. They just posted something they had started working on to the internet. I picked it up and started working on it from there. As a consequence, we can do aggregated signatures. The way aggregated signatures work is that instead of having to check each signature individually, it’s enough to check some structure smaller than all of them, right? Because I could “aggregate” signatures by just stacking them and then checking each one, and call that an aggregate signature. Congratulations - but I didn’t save any space or time, right? I just listed them in order. So, instead of taking an O(N) amount of time or space to describe an aggregate signature, the goal is to describe it in some sublinear size, like logarithmic size. So, an aggregate signature scheme is supposed to take N keys and a log-of-N amount of space for the signature, whereas a normal one-time signature scheme requires N keys and N signatures. You’re taking all of the weight from all of the signatures and chopping it down to just a single signature. So, instead of having to post 50 keys and then 50 signatures, you just post 50 keys and one signature, which is great. You want to be able to merge these signatures together in a way that doesn’t require interaction between all the signers, so that when people announce transactions on the network, they can aggregate those signatures together without interaction from the original signers, and then the stakers can make their signatures, aggregate those together, and get 64 signatures or something like that in the space of one signature.

Of course, the size of the signature and the size of the keys really determine how efficient the whole setup is. Parameter selection is still going on, and I don’t want to get in the weeds on that, but we have had some very, very positive movement in the parameter selection department. We’re pretty sure that we have signatures that are at least competitive with the NIST competition scheme, CRYSTALS-Dilithium, if you compare just stacking N signatures and the keys - which is sort of unfair, because CRYSTALS-Dilithium keys can be reused multiple times and our scheme is a one-time signature scheme. The idea is basically to take all of the weight that you have to put on the blockchain - that’s N signatures - and throw most of it away in a way that’s still verifiable. That’s a fun little question in the whole… [Crosstalk]
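The aggregation idea described above can be sketched end to end. The toy below is written in the discrete-log setting for readability (the lattice version replaces exponentiation with dot products), with illustrative parameters; it is not the Boneh-Kim construction, and a real scheme also needs per-signature weights to resist rogue-key attacks. The point is the shape: responses x + c·y from many signers sum into one value, and a single equation checks them all.

```python
import hashlib

# Toy Schnorr-style non-interactive aggregation sketch (illustrative only).
P = 2**255 - 19  # a well-known prime, used here just as a toy modulus
G = 2            # illustrative generator

def h(*parts):
    """Hash arbitrary parts to an integer challenge."""
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

def keygen(seed):
    x, y = h(seed, "x"), h(seed, "y")              # one-time secret key (x, y)
    return (x, y), (pow(G, x, P), pow(G, y, P))    # public key (g^x, g^y)

def sign(sk, pk, msg):
    x, y = sk
    c = h(pk, msg)                 # challenge from key and message
    return (x + c * y) % (P - 1)   # Schnorr-style response x + c*y

def verify_aggregate(pks, msgs, agg_sig):
    # One equation checks the sum of all responses at once:
    #   g^(sum_i sig_i) == prod_i X_i * Y_i^c_i
    rhs = 1
    for (X, Y), msg in zip(pks, msgs):
        rhs = rhs * X * pow(Y, h((X, Y), msg), P) % P
    return pow(G, agg_sig, P) == rhs

# Anyone can aggregate by summing responses; no interaction between signers.
keys = [keygen(i) for i in range(3)]
msgs = ["tx-0", "tx-1", "tx-2"]
agg = sum(sign(sk, pk, m) for (sk, pk), m in zip(keys, msgs)) % (P - 1)
assert verify_aggregate([pk for _, pk in keys], msgs, agg)
```

Note that the aggregate stays one residue mod P - 1 no matter how many signers contribute, which is the constant-size footprint the lattice scheme is after (there, with keys and responses as vectors of ring elements rather than integers).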

**Michael**

So, the interesting part of what you said is that it’s a bit like a hash. You’re going to be able to agree on that aggregated signature string, but not necessarily extrapolate - or, excuse me, not extrapolate back to what all those signatures were at a point in the past, right? Is that correct?

**Brandon**

Yes, it’s not like the…

**Michael**

You can’t derive them or get them back if you were to go back in time.

**Brandon**

It's not like a zip file sort of thing where you’re like compressing them and then you can expand them back again. It’s a compression only, right?

**Michael**

One way.

**Brandon**

So, you take all the signatures, you map it down to a single small signature, and everybody can check that small signature against the keys and the messages, and make sure that it’s a valid signature in a way that’s… [Crosstalk] Then, all you need is consensus, and then you can just move on in life, go to the next block. I’m sorry, Mitchell. I think you were going to say something.

**Mitchell**

No, that was actually a good segue. I was going to zoom out to the applications for this. One of them, of course, is proof-of-stake consensus mechanisms, and because of the way these signatures are structured, we can actually aggregate transaction signatures into other block signatures if they’re using the same underlying map, so you can imagine that this is a tool for building things. It starts to look like not only proof-of-stake consensus or aggregating signatures. You could use that for on-chain governance, right, or for [Unintelligible] if you want a K-of-N or M-of-N multisignature wallet, right? So, there are 10 parties, and if any six of them, say, sign the transaction, you want that to be valid. So, that could look like anything from multisig QRL transactions to on-chain governance - anything where you want to keep your footprint small on-chain, do that work off-chain, and then upload something small for verification. Then, branching out beyond the signature aggregation piece, the other extensions that we are building with the lattice-algebra library include adapter signatures - I’ll gloss over the technical piece. The applications are cross-chain atomic swaps, which are these trustless, decentralized swaps where we could trade between QRL and Bitcoin, or QRL and Ethereum. Brandon figured out a way to trade secrets between the lattice side and the elliptic curve side, which is what all these other currencies use.

**Brandon**

I think. It hasn’t been proved that it’s secure, but I think so.

**Mitchell**

Okay, what we think.

**Michael**

We’ll cut that part out. [Laughter]

**Mitchell**

Then the other application is payment channels. So, this is, within the QRL ecosystem, starting to build something that looks like the Bitcoin Lightning Network, where you can have a second layer, move a lot of traffic off the base layer, and only settle onto the base layer when you need to open or close a channel. Then there are even further applications out there that we’re not currently building on our roadmap, but a variety of zero-knowledge arguments - which, Brandon, do you know off the top of your head some of the, like, cookbook parameters?

**Brandon**

Yes. I don’t know much about parameters from that. So, with…

**Mitchell**

Or not parameters.

**Brandon**

With small modifications, too - or, okay. So, basically, all the signature schemes that we’ve described so far - the aggregated signature scheme, the adapter signature scheme - these are both doable in a very small number of lines of code with our lattice-algebra library.

**Mitchell**

With the lattice-algebra library.

**Brandon**

Specifically, I could in fact open it up really quick, just because I’m curious. Then, in addition to that, there are certain math papers out there, cryptography papers, that propose a whole ecology of crypto protocols - simple things like accumulators, right? Simple things like commitments and range proofs, right? Very basic cryptographic tools that haven’t really existed in the lattice world for a long time, because before now, lattices were way too inefficient. Lattice cryptography was too large - it was thought to be too large. It turns out we can actually get lattice cryptography down pretty small. If you look at the SWIFFT compression function - SWIFFT with two Fs - that’s based on the fast Fourier transform, which is what I was mentioning earlier about the number theoretic transform and the human eye and the human ear. Their parameters are super, super small. Of course, all they’re trying to build is a function that’s collision-resistant. They don’t have all these extra properties, like unforgeability and so on, that our signature schemes have to deal with, so their parameters can be made super small. So, anyway, until recently, lattice-cryptography-based schemes tended to be too inefficient. With the lattice-algebra library, we hope to change that, and with access to this whole zoo of cryptography protocols that have come out over the past five or six years - it’s been like a renaissance in cryptography - the idea is to be able to rapidly prototype things like an accumulator scheme, a commitment scheme, or a blind signature scheme using our underlying lattice-algebra library. I’m pretty sure I answered that question. Did I get that?

**Konstantinos**

Yes. I wanted to pull us out into this concept of proof of stake. I think it's incredibly important. People have argued that that's what will save the world literally when you're talking about energy consumption and everything. So, first of all, maybe Michael, you want to talk a little bit about proof of stake and then we could talk about how the lattice-algebra library will help enable that for both QRL and maybe other things like Ethereum.

**Michael**

I almost don't know if I should because proof of stake versus proof of work versus proof of X is very tribalistic.

**Brandon**

Contentious.

**Michael**

Contentious and tribalistic, and spears do come out.

**Konstantinos**

Yes, well… [Laughter]

**Michael**

I’m going to give my opinion, but with the prefix that these are my opinions and they may not be representative of the entire team. I like both proof of work and proof of stake. I think there’s something novel about being able to store energy - actual stored energy. I like that the proof-of-work system solved the Byzantine fault tolerance issue, right? There are a lot of conversations, which I think we were alluding to, Konstantinos, about proof of work not necessarily being green, and there are some jurisdictions that are now banning bitcoin mining if miners can’t at least prove that they’re green, by either running directly on green energy or showing proof of green credits, right? At the end of the day - I’m trying not to make predictions here. Well, let’s just do it anyway.

**Konstantinos**

Jump, go ahead.

**Michael**

I see bitcoin as a bit like the world reserve currency back when the US dollar was backed by gold. Back when coins were actually made of silver, pre-1965, and back on the gold standard, pre-Bretton Woods, you could exchange your gold for dollars, and that dollar was essentially a piece of paper that was worth something - and you knew it was worth something because it took actual work. There was blood, sweat, and gears [Laughter] involved in getting that out of the earth, and to me, that’s something that can’t be gamed. At the end of the day, I’m a bit of a purist, a black-or-white thinker, and there are arguments against proof of stake along the lines of, “Okay, so I just put this here, I don’t do anything, and the rich get richer.” You’ll hear narratives like that. I think there’s a place for both of them, and at the end of the day, what I’d say is proof of stake, proof of space, proof of elapsed time, all being pegged to a proof-of-work system - in this case, likely bitcoin. Now, this, of course, hasn’t taken into consideration the quantum threat, because the QRL acts as your quantum insurance policy against - well - proof of stake and proof of work in the 99.9% of other blockchains that depend on elliptic curve cryptography and are susceptible to Shor’s algorithm on a sufficiently powerful quantum computer. I have the opinion that all of the proof-of-X’s complement proof of work, because there’s something ideological about having to actually do something in order to get something. In the history of money, anytime you don’t have to actually do something to get something, that system always seems to evolve and take a form best described as cycle-of-empire theory - think of the eventual collapse of the Roman Empire - and this takes a long time.

So, I probably went off on two or three different tangents there, but that’s my opinion. [Laughter]

**Konstantinos**

So, I guess, either Brandon or Mitchell: how would lattice-algebra help move to at least a partial proof-of-stake system?

**Brandon**

Dude, I have so much to say. [Laughter]

**Mitchell**

I know. Yes, me too.

**Brandon**

I stopped myself. I’m like…

**Konstantinos**

We can start with what you want to say.

**Brandon**

I hope it’s a good thing.

**Konstantinos**

Just touch on how this…

**Mitchell**

Be kind, be kind.

**Konstantinos**

Mitchell, do you want to go?

**Mitchell**

I can briefly comment that I actually am very interested in a variety of consensus mechanisms. I think it’s one piece of this type of framework that’s in many ways orthogonal, right? You can take a given currency with various features, whether that’s privacy or smart contracts, whatever, and you can swap out proof of work or proof of stake, right? So, it’s kind of this orthogonal piece. I really like Nakamoto Consensus and proof of work’s properties, right, where you have this notion of the longest chain and evidence of what went into it. Also, environmentalism is a very real concern - having the best money doesn’t matter if I don’t have air to breathe - so I’m also very interested in alternatives. Proof of stake is definitely one of the more developed and battle-tested ones, so of course we’re putting some time there. Personally, I like proof of space, or proof of spacetime. I think it has a lot of Nakamoto Consensus’s properties, where it can run on commodity hardware and there’s something physically backing it - in this case, it’s bits of storage rather than electrons of hashing - but you end up with similar properties. You can get heavier chains, lighter chains, and all of that, but that’s maybe a different story. Brandon, what were your thoughts?

**Brandon**

I’m super sympathetic to Michael’s perspective, and yours also. Environmentalism is deeply important to me. Unfortunately, there’s this really habitual thing that happens in the Silicon Valley world, and this is a criticism of that particular culture, though it happens almost everywhere: people say, “I can sell you a product that does X, Y, and Z,” and it turns out, after decades of selling this product, that somebody can mathematically prove you can have X and Y, or Y and Z, or X and Z, but you can’t have all three at the same time. There are these theorems in mathematics called impossibility theorems that say, essentially, “You can have any two of these three properties, but you can’t have all three.” Oftentimes, reasonable sets of assumptions come in threes that are mutually exclusive in this way. I personally have read a variety of proof-of-stake papers that are rather critical of the underlying assumptions that lead to the security argument. For example, there’s a paper called “Rethinking Large-Scale Consensus” by Elaine Shi. I’ve not met her or seen her speak, actually, but that paper proves that under a reasonable set of assumptions, like wanting people to be able to join your staking system at arbitrarily long periods of time in the future, right, you have late spawning, so people can jump in some time after the genesis block and start staking. That assumption, unfortunately, contradicts some other reasonable assumptions, and so it seems to me that proof of stake is not provably secure under reasonable assumptions.
Having said that, you can tie stake together with work in alternating chains, and that can offload some of the energy cost of proof of work onto proof of stake. Then you can use proof of work’s security as the foundation for the proof of stake for a certain number of blocks, and you can still keep those security claims. I really like proof of space, and I really like proof of time, when they’re described in the correct context, or rather, I don’t want to say “the correct context,” because that implies there might be something wrong out there, but there’s a certain context, the real-world model of how we want to use these things, and I like proof of space for those. Michael, you mentioned something that reminded me of the book “A Thousand Years of Nonlinear History,” which talks about how every single time humanity gains access to a new energy source, our whole society recrystallizes. When we went through the agricultural revolution, all of a sudden we could collect all the solar energy stored inside plants, and we recrystallized into an agricultural society. Then, when we started burning coal for energy, we crystallized through the industrial revolution into a different form of society. I’m kind of intrigued by cryptocurrency as, in part, a picture of that process, because what you’re seeing is people wanting to go back to something like the gold standard, but we don’t want to base our money on blood, sweat, and tears and people dying in the ground, right? It’s an interesting question, but overall, I think there are some really interesting ways to move forward to mitigate the energy costs of proof of work. Unfortunately, I think proof of stake, let’s say, is based on reasonable but contradictory assumptions.

**Michael**

So, human behavior. Proof of stake, really, the consensus, at the end of the day, if you look at things completely objectively, is based on expected human behavior according to game theory, right? If I have a large stake and control a large portion of consensus, there’s a strong game-theoretic argument from human nature that says I’m going to want to make sure that the things I’m doing are in the best interest of the network. Our project is moving toward proof of stake, and I agree with that, but like I said, and I like all the things you said, Brandon and Mitchell, both of you, at the same time, I see proof of work as more mathematically pure in that it doesn’t depend on biological or subjective variability. It really just depends on math, and math is one of the most objective…

**Brandon**

It depends on physical work. It’s the number of electrons that have been moved from point A to point B, right? That’s what proof of work is all about, and when you’re talking about the ledger as a recording of that expenditure of energy, you can’t fake it, right? With proof of stake, you have the stakeholder model, basically a corporation running your decision-making process.

**Konstantinos**

You’re also subject to other human behavior, right? If a government imposes lockdowns because of power consumption, or if a region can’t support the machinery, then you’re subject to all of that too. That’s why some kind of hybrid approach seems to be the best way: a good amount of work, a good amount of stake, and then everyone is covered.

**Michael**

That’s true, but at the same time, someone else will pick up the slack and is that government doing a disservice to its constituents? I mean, go ahead.

**Konstantinos**

Sorry. I’m sorry, Michael. Dude, I agree with you completely. One of the problems with the whole cryptocurrency world is that whatever you base your consensus mechanism on tends to undergo hyperinflation. So, if you have proof of work, then you have hyperinflation of CPUs, GPUs, and ASICs, which end up becoming really cheap to produce in a couple of years and exist in large numbers, because work is easy and profitable. On the other hand, if you start doing something like proof of space and hard drives undergo this hyperinflation, all of a sudden Amazon can hammer out any consensus it likes and it owns the world, right? So, whatever consensus mechanism you base your economy on sort of has to go through this hyperinflation period. If we do work, then it’s hyperinflation of work and work devices. If you base it on disk space, then it’s going to be disk space. I don’t know what the solution is there for sustainability, because it really, really seems like the paperclip machine that’s designed to build more paperclips.

**Michael**

To that point, and case in point, I remember when Chia came out…

**Konstantinos**

I was just about to mention that. [Laughter]

**Michael**

You were about to mention that? Konstantinos, help me out here. What happened to hardware prices for a 16- or 18-terabyte drive?

**Konstantinos**

So, Chia, for those who don’t know, it’s kind of funny because the creator of BitTorrent essentially said, “Hey, you know what’d be a great proof of something? How about your hard drive space? The more you have, the more you can farm.” They call it farming, so now that song is stuck in my head. Anyway, you’ve got this situation where you’re farming using hard drives, and it’s kind of funny because BitTorrent was always sort of a hard drive thief, wasn’t it? You start downloading movies and music, and all of a sudden your hard drive is full. So, I guess it kept going with the same principle, but yes, prices went up, and that affects the whole thing. So, to tie this all together, I’d just love a high-level recap of, let’s say, two or three of the future moves and technologies that you think lattice-algebra will help implement, to basically bring it all home for the listener.

**Michael**

Okay, super easy. Aggregate signatures are going to make it so that the cost of staking is about the cost of, hopefully, a single CRYSTALS-Dilithium signature, which is like two kilobytes, or like five kilobytes. That’d be super fantastic. Being able to take all of the stake stuff and non-interactively aggregate it into a very small, comparatively small, amount of block space: fantastic. There’s a technique called secure farm architecture that we’re looking into right now that’s based on mathematical coding theory, using the hash chain of the blockchain, the chain of hashes, as an authentication structure to run a very simple coding-theory-based algorithm so that you can basically shard your data. So, long-term, local nodes can start forgetting sections of the blockchain that have become inactive, and they don’t even need to track when that happens. It just happens naturally that they stop accessing that part of the blockchain, and then they end up storing 1/100 of it, but in a verifiable way, so that if somebody later spends out of that section of the blockchain, they can still reconstruct the entire thing. Super fantastic. So, unlike aggregate signatures, which only compress things down, this is a peeling-decoder method, where everybody stores 1/100 of the blockchain, and then you talk to a hundred different nodes, compare, peel off the layers of the blockchain that you need, and use the hash chain to make sure that nobody’s lying. Super cool. So, long-term, you can reduce the total percentage of local storage space for nodes.
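As a toy illustration of the hash-chain-as-authentication idea described above (this is not the actual secure farm architecture protocol; the function names and the simple chaining scheme here are assumptions for illustration), a node that kept only the chain of hashes can verify a forgotten block payload fetched from a peer:

```python
import hashlib

GENESIS_PREV = b"\x00" * 32  # placeholder predecessor for the first block

def block_hash(prev_hash: bytes, payload: bytes) -> bytes:
    """Each block's hash commits to its payload and the previous hash."""
    return hashlib.sha256(prev_hash + payload).digest()

def build_hash_chain(payloads):
    """Return the list of chained hashes for a sequence of block payloads."""
    hashes, prev = [], GENESIS_PREV
    for payload in payloads:
        prev = block_hash(prev, payload)
        hashes.append(prev)
    return hashes

def verify_recovered_block(hashes, index, candidate_payload):
    """A node that forgot block `index` checks a payload fetched from a
    peer against the hash chain it kept: recompute the link and compare."""
    prev = hashes[index - 1] if index > 0 else GENESIS_PREV
    return block_hash(prev, candidate_payload) == hashes[index]
```

For example, after `chain = build_hash_chain([b"tx1", b"tx2", b"tx3"])`, a node can confirm `verify_recovered_block(chain, 1, b"tx2")` while a tampered payload fails the check, which is the sense in which the hash chain keeps peers from lying.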
We’re going to be able to aggregate signatures to make the overall blockchain smaller, which hopefully should enable some sort of hybrid stake-work animal that’s secure. And if not that, if a community really wants stake, then hopefully we’ll be able to do it securely. My goal, really, my goal is that each block just consists of the output keys and a single signature. That’s it. That’s what I want, plus a header and hashes and stuff like that. Basically, just…

**Brandon**

It does everything. It doesn’t sacrifice any usability or use cases. That would be ideal, and post-quantum.

**Michael**

Yes.

**Konstantinos**

Let’s remember that right now, in the case of Ethereum, gas prices are through the roof, worse than the gas you put in your car, and any of these moves would help take the pressure off of that.

**Brandon**

I should point out that the secure farm architecture stuff is not related to the lattice-algebra library. It’s a completely blockchain-agnostic, different animal that can be layered on top of any blockchain out there right now.

**Mitchell**

It’s like a consensus uTorrent storm or swarm. Sorry, swarm is the word.

**Brandon**

Yes, that’s basically –

**Mitchell**

That’s how we see it.

**Konstantinos**

There’s a lot of BitTorrent feel in the things moving into the blockchain world. You can see it.

**Brandon**

I would say…

**Michael**

I hope we’re making Brandon proud. [Laughter]

**Konstantinos**

I guess, with that, we’ll wrap up. Good luck on the project. A lot of people are going to be listening to the section at the end of this episode called “Coherence,” where I summarize what we talked about. I have a feeling, for this one, they’re going to tune in for that part, so thanks very much, guys. I really appreciate it.

**Brandon**

Excellent.

**Mitchell**

Thank you.

**Michael**

Thank you.

**Konstantinos**

Thank you guys so much.

**Konstantinos**

Now, it’s time for Coherence, the quantum executive summary where I take a moment to highlight some of the business impacts we discussed today in case things got too nerdy at times. Let’s recap:

The Quantum Resistant Ledger, or QRL, is a purpose-built blockchain designed to be post-quantum safe. Check out our first episode for more on the project. Recently, the QRL partnered with Geometry Labs to release lattice-algebra, a cryptographic library available now on GitHub. Linear algebra comes up a lot in quantum algorithms, and it comes up in this cryptographic approach too. The lattice-algebra library handles fundamental mathematical objects like polynomials and vectors in an optimized, safe way to aid cryptographic solutions. Working with these objects as types becomes almost as easy as multiplying integers when writing code in Python; the work is passed on to the lattice-algebra library underneath. Libraries minimize coding time and reduce the security risks that come from mistakes made when writing everything from scratch. With primitives at their disposal, coders can start cryptographic projects way ahead of the post-quantum cryptography, or PQC, game. In this case, the hope is to build post-quantum secure code quickly. Developers don’t need to be experts in the PQC underneath. Also, the hope is that these primitives will aid scalability.
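To give a flavor of the kind of object such a library manages (a naive, illustrative sketch in plain Python, not the actual lattice-algebra API; the parameters and function names are assumptions), here is arithmetic in a small polynomial ring of the sort lattice schemes build on:

```python
# Toy arithmetic in the ring Z_q[X] / (X^d + 1). Real schemes use much
# larger parameters and fast NTT-based multiplication; this schoolbook
# version only illustrates the operations a library would optimize.
Q, D = 257, 4  # tiny demo modulus and degree, chosen for readability

def poly_add(f, g):
    """Coefficient-wise addition mod q."""
    return [(a + b) % Q for a, b in zip(f, g)]

def poly_mul(f, g):
    """Schoolbook multiplication, then reduce by X^d + 1 (so X^d = -1)."""
    out = [0] * (2 * D)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] = (out[i + j] + a * b) % Q
    # Folding: the coefficient of X^(k+d) wraps around with a sign flip.
    return [(out[k] - out[k + D]) % Q for k in range(D)]
```

With a library providing these as types, `f * g` reads like integer multiplication; here, multiplying by `X` (the list `[0, 1, 0, 0]`) rotates coefficients with the expected sign wrap from `X^d = -1`.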

The lattice-algebra library can be used to implement a Schnorr-like one-time signature scheme. This could allow for zero-knowledge proofs and signature aggregation, which is useful for PQC wallets and blockchains. There’s a lot of weight on blockchains today associated with storing multiple signatures, and lattice algebra could reduce this. For example, instead of posting 50 keys and 50 signatures, you can have 50 keys and one signature. Like a hash, this solution can’t be reversed. The team is also working on cross-chain atomic swaps to trade secrets between blockchains: swap between, say, QRL and Ethereum in a trustless way. It seems many zero-knowledge approaches are possible. The team sees this library as opening up a cryptographic renaissance of applications in the future. Lattice algebra could potentially aid proof-of-stake solutions with lattice-based signatures that take up a minuscule amount of space, maybe some five kilobytes. Proof of stake is an almost political area of blockchain, so we’ll leave it at that. Another promising future application the team is interested in is hashing parts of the blockchain in a way that enables sharding. This means local nodes can forget inactive parts of the blockchain, storing only about one percent of it. Think of the space savings there.
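A rough back-of-the-envelope calculation shows why the “50 keys, one signature” aggregation matters for block space (the sizes below are illustrative assumptions, not measured figures for any real scheme):

```python
# Hypothetical sizes in kilobytes; real post-quantum signature and key
# sizes depend on the scheme and parameter set.
N_SPENDS = 50
SIG_KB = 2.0   # assumed size of one signature
KEY_KB = 1.5   # assumed size of one public key

# Naive: every spend posts its own key and its own signature.
naive_kb = N_SPENDS * (KEY_KB + SIG_KB)        # 175.0 KB

# Aggregated: 50 keys, but one aggregate signature for the whole block.
aggregated_kb = N_SPENDS * KEY_KB + SIG_KB     # 77.0 KB
```

Under these assumed sizes, aggregation saves more than half the block space, and the signature term stops growing with the number of spends.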

That does it for this episode. Thanks to my guests, Mitchell Krawiec-Thayer and Brandon Goodell from Geometry Labs, and Michael Strike from the QRL, for joining. Thank you for listening. If you enjoy the show, please subscribe to Protiviti’s The Post-Quantum World and maybe leave a review to help others find us. Be sure to follow me on Twitter and Instagram @KonstantHacker, that’s Konstant with a K, Hacker. You’ll find links there to what we’re doing in quantum computing services at Protiviti. You can also DM me questions or suggestions for what you’d like to hear on the show. For more information on our quantum services, check out Protiviti.com or follow Protiviti Tech on Twitter and LinkedIn. Until next time, be kind and stay quantum curious.