Can hear you loud and clear. Awesome.
Okay, I see that we have all of our speakers. Just before we begin, I'd love to do a quick mic check for Sreeram, Calvin, and Jacob.
I'm going to stay on here.
All good. Yeah, you're both coming through. So let's begin. Welcome, everyone, to this week's Mantle AMA. We're super excited to have our partners from EigenLayer join us today. We had literally thousands of questions submitted, so I think it's safe to say the community is also excited. Before we begin, just a few housekeeping items. If you can take a moment to like and retweet this AMA, it'll really help us amplify what the speakers have to say today. You can also tweet your questions to us using the hashtag #BuildOnMantle; our community team will be searching for new questions while we're doing the AMA. To introduce myself: my name is Cooper Midroni, I'm a product manager at Mantle, and I've been working with the EigenLayer team on our integration. We have some really great speakers for you today, and I'll let them introduce themselves. Let's start with Sreeram, then Calvin, and end with Jacob.
- Sreeram, would you like to introduce yourself? - Yeah, sure. Hi everybody, I'm Sreeram. I'm based in Seattle, and I've been working on the EigenLayer project for more than 18 months now. We're excited to be here to share our vision and our partnership with Mantle.
And I'm Calvin. I'm the Chief Strategy Officer at EigenLayer; I've been there for about four months, really focused on the business side. Prior to this, I was the head of strategy at Compound, the DeFi protocol, for about four years. Nice to meet everybody.
Hi everybody, I'm Jacob. I'm a core contributor to BitDAO, leading on the product side for Mantle, and I'm also the founder of a Web3-native game launcher called HyperPlay.
So let's start off with a really introductory question for Sreeram and Calvin. Let's ask a question we saw a lot in our community polls, which is quite simply: what is EigenLayer?
Sreeram, you might be muted if you're trying to talk.
No, I couldn't hear anything for a second there.
So just to reiterate what we want to cover: could you tell people a bit about EigenLayer, how they can think of the protocol, and some basic information about what you're working on?
Well, in case Sreeram is having some trouble, I'm happy to jump in. So yeah, EigenLayer is a restaking protocol. What that means is, essentially: Ethereum pretty recently moved fully to proof of stake after the Merge, and there is a lot of ether staked across a lot of validators that are securing the Ethereum network and all the transactions and economic activity on top of it. The question EigenLayer asks, or rather answers, is: could you reuse that stake, or rent out that stake and validator set, to other networks, services, protocols, and applications besides just Ethereum? So if you wanted to build certain types of blockchain-based software, like oracles or bridges or data availability solutions, you might want a lot of cryptoeconomic security to launch one of these systems. With EigenLayer, you can rent it from the stake that's currently being used to secure Ethereum, rather than trying to bootstrap your security from scratch yourself. The status quo is that a new network that needs a lot of security will probably launch its own token and its own validator set and try to bootstrap that security from the ground up. But with EigenLayer, you could let ETH stakers opt in to restake their staked ETH to secure not only the Ethereum network but also whatever network, application, or service you are trying to build.
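As a toy model of what Calvin describes (all names and numbers here are purely illustrative, not EigenLayer's actual contract interface): the same staked ETH can opt in to secure additional services, so a service's economic security is the sum of the stake that opted in.

```python
# Toy restaking model: the same staked ETH that secures Ethereum can opt
# in to secure additional services. Illustrative only.

class Staker:
    def __init__(self, staked_eth: float):
        self.staked_eth = staked_eth
        self.services = {"ethereum"}  # the base protocol is always secured

    def restake_into(self, service: str) -> None:
        """Opt the same stake into an additional service (no new capital)."""
        self.services.add(service)

def total_security(stakers: list[Staker], service: str) -> float:
    """Economic security backing a service = sum of opted-in stake."""
    return sum(s.staked_eth for s in stakers if service in s.services)

stakers = [Staker(32.0) for _ in range(5)]
for s in stakers[:3]:
    s.restake_into("eigenda")

print(total_security(stakers, "ethereum"))  # 160.0: all stake secures L1
print(total_security(stakers, "eigenda"))   # 96.0: only opted-in stake
```

The point of the sketch is that "eigenda" gets real economic backing without any staker putting up fresh capital.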
That's great. So one way to think of it is that it allows other people to benefit from the security Ethereum has built up as it has grown. Would you say that's a good summary? Yeah, I think at the highest, 30,000-foot level, that's right. It was many years of work, and Ethereum is pretty Lindy: a lot of people trust it, it's been around a long time, and so it's accumulated a ton of trust and security, and yeah, other people can benefit from that.
Yeah, and of course, that's where our partnership comes in. So maybe this is a great place to ask Jacob: could you tell people how Mantle and EigenLayer are working together?
Yeah, so EigenLayer has a product called EigenDA, which is a decentralized, high-performance data availability layer secured by Ethereum. Back in February or March of 2022, I was introduced to some of the talks Sreeram had given, and I was honestly pretty deeply inspired by them. Sreeram had talked about how EigenLayer could secure other features of Ethereum that were not built into the core protocol but could still have the same shared security and decentralization derived from Ethereum. So this one particular piece the EigenLayer team has been working on, EigenDA, really helps us to dramatically reduce the gas fees of a layer 2 beyond what a traditional layer 2 is capable of.
And just to give some context on why this is the case: the vast majority of the transaction fee incurred on a layer 2 goes towards paying for data on L1. The Ethereum L1 was never meant to be used as a data protocol; it's great at many things, but data storage is not really what it was intended for. So with EigenDA, we're able to break the data portion of the transaction out into a separate data availability protocol, and to allow those restaking incentives Calvin was talking about earlier to be used to ensure that that data protocol is still secured by Ethereum. Ultimately, our goal is to reduce L2 transaction fees by about an additional 80% beyond typical L2 transaction fees today.
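Jacob's point can be made concrete with back-of-the-envelope arithmetic. The dollar figures below are invented for illustration; only the rough "~80% additional reduction" goal comes from the discussion.

```python
# Hypothetical L2 fee breakdown: if posting data to L1 dominates the fee,
# moving that data to a cheaper DA layer removes most of the cost.

def l2_fee(execution_cost: float, da_cost: float) -> float:
    return execution_cost + da_cost

# Invented rollup tx: $0.02 of execution, $0.18 of L1 data posting.
baseline = l2_fee(0.02, 0.18)             # data is 90% of the $0.20 fee
with_cheap_da = l2_fee(0.02, 0.18 * 0.1)  # assume a ~10x cheaper DA layer

reduction = 1 - with_cheap_da / baseline
print(f"{reduction:.0%}")  # 81%
```

With data dominating the fee, even a 10x cheaper data layer yields roughly the 80% overall reduction mentioned.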
That's a great summary. I just want to point out that there's a really great dashboard out there called the Rollup Economics Dashboard, which tracks a lot of these costs on L2s, so people can see with live network data just how much of the data availability cost is passed on as fees to users.
Since we're talking about EigenDA, I think it's a good time to have Sreeram or Calvin talk a bit about the relationship between EigenLayer and EigenDA, just so people have a good understanding of each in their minds. Yep, let me see if my audio works now.
Okay, awesome. Thank you, Cooper. Sorry about the technical trouble earlier. Yeah, so EigenLayer is a general-purpose platform that supplies decentralized trust from Ethereum stakers, who opt in, to any service that wants to build on top. What could be built on top? Any kind of new service: this could be data availability services, MEV management, bridges, keepers or event-driven activation, even whole new chains. All of these are types of services that can be built on top of EigenLayer.
When we look at what is the most interesting service that could be built today, and what is most needed, we settled on data availability as one of those examples. I think Cooper and Jacob both beautifully pointed out how data availability costs are dominant in rollups, as well as how we ended up partnering together. What EigenDA does, essentially, is take the Ethereum validator set as a security source and then create a new kind of protocol for when a rollup wants to commit data. Today, rollups write that data to Ethereum. The reason they need to do this is that rollups offload computation and just submit proofs, but for somebody else to verify that a proof was executed correctly, or to continue executing proofs afterwards, they need a snapshot of the inputs to that computation. That is what is being published to Ethereum today. Instead, if we had a separate layer, supported by Ethereum, which could provide this proof of publication, or data availability as it is more popularly called, in a much cheaper manner, that would be quite valuable. Ethereum's own roadmap includes many breakthroughs that contribute to improving data availability. The most important one is called danksharding, which is still quite a bit down the road, and there's an intermediate version called proto-danksharding. Both are built out of basic primitives like erasure codes and KZG polynomial commitments, which also form the core primitives behind some ZK proofs. So EigenDA takes some of these cryptographic elements and builds a new module on top of EigenLayer
for providing very cheap data availability. And when we talk about very cheap data availability, one view we take at EigenLayer is that instead of evaluating the fees, we should evaluate the cost basis of providing data availability. Flipping it another way: the reason Ethereum data availability is expensive is that the total data bandwidth available on Ethereum is limited. If we can expand the data bandwidth available on our data availability service, then we can significantly reduce the cost. And that's exactly what EigenDA does: it expands the data bandwidth without expanding the node requirements. Just to give a sense of the orders of magnitude here: if you used the Ethereum blockchain purely as a data availability resource, doing no computation, nothing else, just writing data to Ethereum, then Ethereum could support something like 80 kilobytes per second today. What we are shooting for in our first version is around 10 megabytes per second, a couple of orders of magnitude improvement over what Ethereum offers today. And as we start getting more and more interesting use cases on EigenLayer, like Mantle, and as demand on these services goes up, we want to scale the data bandwidth proportionately. My mental model for this is the cloud. We talk a lot about block space in the crypto community, which basically means how much data bandwidth or computation bandwidth is available. If you take just the data dimension, there is no equivalent concept in the cloud. There is no "cloud space" that says this is the total amount of cloud space, and if you want to store more, you're out of space. What happens instead is that the cloud stretches to accommodate demand. This is exactly the model we are building for EigenDA: we keep stretching the capacity as more and more demand comes on board. The underlying technology and the systems theory are already available; we just need to build better and better engineering to make it feasible. So that's the core underlying promise of EigenDA. The way it accomplishes this is by not requiring every node in the network to download all the data. No node downloads all the data; every node downloads only a very small portion of it, but together the nodes have enough views of the data that even if a lot of nodes go offline, you can still reconstruct pretty much all of it. That's the core principle of EigenDA, and that's why we can reduce the cost of data availability so significantly relative to Ethereum.
Yeah, it's pretty exciting to think about what will happen as we remove more and more of the constraints in the current environment we're building in. You did talk a bit about some of the features that differentiate EigenDA. Could you go a bit deeper into how you would compare it to other blockchain data availability services out there? You talked a bit about proto-danksharding; any others that come to mind, and how do they compare? Yeah, absolutely. So the idea of modularizing the blockchain, that there is going to be a separate layer which just provides data availability and a separate layer which provides computation, is one of the core ideas underneath the rollup era. These ideas were pioneered in the Ethereum community, and some members of the community have gone on to build other blockchains, one of them being Celestia, which is a data availability blockchain. Polygon has also built its own version of a data availability blockchain, called Polygon Avail. So what are the fundamental differences between these systems? And of course, I also talked about danksharding and proto-danksharding, which pioneered several basic primitives needed to build a scalable data availability system. So what are the differences and similarities between EigenDA and these solutions? The first one, which is
maybe the most obvious one, is that EigenDA is supported by EigenLayer, and EigenLayer is supported by Ethereum's trust network. That is fundamentally different from the others: to build a data availability network, Celestia and Polygon had, out of compulsion, to create new networks to support data availability validation. The reason they had to create a new network is that prior to EigenLayer, whenever you had a new idea for a system providing some service, the only way to realize it was to create a whole new blockchain network. That means a new set of validators, usually a new token that validators stake to participate, some kind of consensus protocol maintaining the new blockchain, and then the particular service, the data availability, built on top. This is the architecture that both Celestia and Polygon Avail take. Whereas the luxury we have, because we are building on EigenLayer, is that a lot of those things have already been taken care of. For example, the validator network is not something EigenDA has to bootstrap for itself. It's not a new network; it is just a subset of the Ethereum network, and the more successful EigenDA gets, the more Ethereum stakers can simply opt in. So that's number one, the most obvious distinction between EigenDA and the other projects out there. But there are also
other levels of distinction. One is that because we are such an Ethereum-centered project, and we have this set of validators from the Ethereum blockchain, we don't have to build a new consensus protocol or an ordering layer. In other blockchains that provide data availability, like Celestia or Polygon Avail, you have a consensus layer that provides the ordering of all these data availability commitments; they're all entered into a ledger. For us, because we are building EigenDA purely as an add-on for Ethereum, and it just serves Ethereum rollups, all we have to do is write these data availability commitments into Ethereum, and Ethereum itself orders them into a ledger. So we don't have a ledger of our own. The ledger is the Ethereum ledger, and all this committee does is sign data availability attestations that then go onto Ethereum. This, I would say, is a very layered architecture, because all we're building is one particular layer which just does data availability. And once you separate things out so that the data availability engine does only this one thing, receive units of data and make data availability attestations, which are then aggregated and ordered on top of Ethereum, you have many degrees of freedom in building the system. You can optimize it for latency, or for throughput, in ways you could not if the system also had to do other things like ordering the ledger. So by adopting this separation principle, or layering principle, we can get a series of performance benefits that are difficult to achieve in these other systems. I mentioned throughput as one example. We don't know the other systems' particular parameters, but we can continue to track them as they come online.
Another dimension of EigenDA that we've thought about a lot is pricing. And I don't want to make this point only about data availability, since those other systems are not live yet; existing blockchain systems in general fundamentally price only for congestion. On Ethereum, for example, if there is not enough transaction demand to fill the block space, the pricing protocol keeps decreasing the price until there is demand to fill it. This is what we call congestion pricing: you only pay up when block space becomes congested. But as I was alluding to, in the cloud there is no congestion. You can reserve instances on Amazon or other systems without worrying about congestion, and in fact people will pay a premium to avoid the risk of congestion. So the idea of EigenDA is to also price things so that you can reserve a certain amount of data bandwidth. The very premise of EigenDA is that data bandwidth should be abundant. But if data bandwidth is abundant, there is no congestion; if there is no congestion, there is no price; and of course a system doesn't work without a price, because stakers and validators are putting their capital and operational expertise into it and need to get paid. The way we square this is with a completely different method we call reservation pricing. You can reserve a certain amount of bandwidth on EigenDA over a certain period, say six months or one year, and during that period the bandwidth is completely reserved for that particular rollup, say Mantle. What this gives you is a price certainty that is simply not available anywhere else. And because the rollup gets price certainty, it can design its economics around this quite efficiently. Finally,
just to add one more thing to it: when you're running a business, say an airline, and you don't know what the price of jet fuel is going to be, it is very difficult to sell tickets six months ahead. But if you have some kind of deal or contract that allows you to buy ahead of time, or you have hedged your price risk in futures markets, then you are able to price things for the future. That's exactly what EigenDA reservation does: it empowers rollups like Mantle to go ahead and create more innovative economic systems that let them tell their users, "Up to this many transactions per second, this will be the price," with some certainty around it. The final thing EigenDA enables, because it's such a modular system, is what we call dual staking.
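The contrast between congestion pricing and reserved bandwidth that Sreeram just described can be illustrated with a toy calculation. All prices here are invented; EigenDA's actual pricing will differ.

```python
# Hypothetical comparison: volatile spot (congestion) pricing vs. a flat
# reserved-bandwidth rate known in advance.

def congestion_cost(mb_posted: float, spot_prices: list[float]) -> float:
    """Pay each period's fluctuating spot price for the data posted."""
    return sum(mb_posted * p for p in spot_prices)

def reservation_cost(mb_reserved: float, flat_price: float,
                     periods: int) -> float:
    """Pay a fixed rate for reserved bandwidth, regardless of congestion."""
    return mb_reserved * flat_price * periods

spot = [1.0, 1.2, 3.5, 0.8, 4.0, 1.5]   # volatile per-MB spot prices
print(congestion_cost(10, spot))         # 120.0: exposed to the spikes
print(reservation_cost(10, 1.5, 6))      # 90.0: known before period one
```

The reserved buyer pays a predictable total, which is exactly what lets a rollup quote stable fees to its own users.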
Dual staking is a general feature of EigenLayer, particularly instantiated for EigenDA. The idea is that whenever you have to secure something on EigenLayer, ETH stakers are putting in their stake and basically underwriting the validation. The only way we can hold them accountable is if there is a provable on-chain slashing event, where it's proved that they misbehaved in some way, and then they can be slashed. But there is a benefit, in many systems, if another token community wants to stake and participate in the same type of validation. For example, in EigenDA, for providing data availability, it could be that the Mantle stakers, that is, the BIT stakers, and the ETH stakers both stake, both of them provide assurances on data availability, and the rollup contracts consider the data to be available only if both of these quorums state that it is available. What this does is open up a whole degree of freedom for rollups to innovate on their own economic design. I mentioned the economics of reservation bandwidth earlier, but this is another dimension: the rollup token can now be used as a staking token in EigenDA validation while you're also getting security from Ethereum. So dual staking is dual security. You have two layers of safety: one group, the ETH stakers, certifies that the data is available, but some other group of stakers, specific to a rollup like Mantle, also certifies that the data is available, and only then do the rollup's smart contracts consider the data available. So these are, I would say, the dimensions in which EigenDA is different. There's also more: EigenDA is customizable, which means you can customize some of its features, for example which algebraic field the operations happen in. There are degrees of freedom in EigenDA that are not available when data availability is integrated deeply into an existing blockchain. So that's the summary of the differences, and of course I should also point out the similarities: the idea of data availability sampling is inherent in all these systems, and the idea that there is some kind of erasure code is inherent in danksharding, in Polygon Avail, and in Celestia. But the exact details of how the distributed system works, who samples data, who needs to download data, all differ, and these details matter a lot in the end-to-end performance of the systems.
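The dual-quorum rule Sreeram outlines, where both the ETH-restaker quorum and the rollup's native-token quorum must attest, can be sketched as follows. The thresholds and structure are hypothetical, not the actual rollup contract logic.

```python
# Sketch of the dual-staking ("dual quorum") availability rule: data counts
# as available only if BOTH quorums attest past their thresholds.

def quorum_reached(attested_stake: float, total_stake: float,
                   threshold: float = 2 / 3) -> bool:
    return attested_stake / total_stake >= threshold

def data_available(eth_attested: float, eth_total: float,
                   native_attested: float, native_total: float) -> bool:
    """Dual-quorum rule: both quorums must independently confirm."""
    return (quorum_reached(eth_attested, eth_total)
            and quorum_reached(native_attested, native_total))

print(data_available(70, 100, 80, 100))  # True: both quorums pass
print(data_available(70, 100, 50, 100))  # False: native quorum falls short
```

The AND between the two quorums is what makes this strictly stronger than either a native-token committee or the ETH restakers alone.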
Yeah, thank you for that comprehensive explanation. Just to recap a few points that stood out to me: leveraging the existing trust network of Ethereum, one of the strongest trust networks out there, is a huge benefit. We talked a bit about how we as Mantle are able to keep the economic design of our own token, BIT, while benefiting from Ethereum's security, thanks to dual staking. And I thought your point about breaking out of the congestion pricing paradigm into something more like modern futures markets, where people can predict costs and pay for them in advance, was striking; that will open up a lot of new activity on top. These are just a few ways the community can begin to think about how EigenLayer is going to change the dynamics of a lot of the dapps we see today. I want to double-click on something a lot of users are asking: how does Mantle's use of EigenLayer change the security model of a rollup? Are you able to talk a bit about the implications, if there are any, of using EigenDA with Mantle for security?
Yes. The obvious point here is that security is derived from restaking on EigenLayer, and therefore how much is restaked, and how many distinct stakers there are on EigenLayer, will affect the security of these rollups. You might call them quasi-rollups, because they're not writing data fully to Ethereum; they are basically modular chains built on top of EigenDA. So depending on how much stake there is, and how many distinct Ethereum stakers opt in, which affects decentralization, there is a transfer of security that is different from Ethereum's. That's the first point: EigenDA doesn't inherit exactly Ethereum's security, because it is conditional on opt-in from stakers and validators. I would say this is the most important difference in terms of the fundamental security model, relative to writing directly to Ethereum. But you could instead compare it to writing to a special token committee dedicated only to the rollup: for example, the Mantle rollup writing data only to a data availability committee maintained by the BIT stakers. Relative to that, this model is uniformly better in security, because we are able to get a portion of the Ethereum stakers and validators to opt in. One can then try to reason about what amount of restaking is enough to run a chain with a certain TVL, and so on. Those are nuanced technical discussions, and I don't know if this is the ideal space for them, but the core two dimensions, and I think these are distinct dimensions, on which one can think about security being transferred from Ethereum to EigenDA, and therefore to the rollup, are how much economic stake there is and how much decentralization is transferred, meaning how many different node operators or stakers are participating.
And of course there are reasons and incentives for people who are staking on Ethereum to want to restake. So Calvin, could you speak a bit to some of the reasons stakers would want to become restakers with EigenLayer? Yeah, happy to. Fundamentally, there are a lot of angles to this. The most evident one is that if you are an ETH staker, you're earning yield on your staked ETH, and as different services like EigenDA launch on top of EigenLayer, there is economic value flowing through these systems, and you as a restaker should expect to receive some portion of that economic activity in exchange for providing security. Put a little more simply: in the EigenDA example, rollups pay fees through EigenDA to restakers on EigenLayer, so you can expect to receive a portion of those fees and increase your yield. What's really interesting is that EigenDA is just the first service people will see built on top of EigenLayer, but over time hopefully there are two, five, ten, maybe a hundred different networks on top of it, and you as a staker can pick and choose which services you actually want to stake into, based on what type of services you want to support and the risks and rewards of staking into each. So you get a lot of choice with your staked ETH about what you want to do with it and what kind of economic activity you want to secure. As opposed to right now, where you stake it just for Ethereum and get this one yield, you'll be able to stack yield in many different ways. So it's really about capital efficiency, I guess, if I had to pick one phrase.
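Calvin's point about stacking yield reduces to simple arithmetic. All rates below are hypothetical, and the sketch deliberately ignores the added slashing risk that each extra service brings.

```python
# Toy yield stacking: base Ethereum staking yield plus fee streams from
# each opted-in service, all earned on the SAME capital.

def stacked_apr(base_apr: float, service_aprs: list[float]) -> float:
    """Additive reward streams on one pool of stake (risk ignored)."""
    return base_apr + sum(service_aprs)

base = 0.04                      # ~4% from vanilla ETH staking (assumed)
services = [0.015, 0.01, 0.005]  # hypothetical fees from three services
print(f"{stacked_apr(base, services):.1%}")  # 7.0%
```

The capital-efficiency claim is exactly this: the same 32 ETH earns every line of that sum at once, rather than one yield per unit of capital.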
Yeah, and one consequence of what Calvin just explained is that there is a massive network effect: as other middleware onboard onto EigenLayer, more stakers opt in to serve these middleware, and they all start validating for EigenDA as well, which increases the security of these different services. And if you think about the economics of staking today, staking economics is dominated by the cost of capital rather than the cost of operations. To take an extreme example, consider something like Solana, which people say takes a lot of computational and operational expense: each node may cost around $10,000 annually to run, and there may be thousands of nodes, which only adds up to roughly $10 million in operational expenses every year to run the entire Solana network. But the capital cost of staking, given the huge amount of stake on the Solana network and the APR those stakers expect, is far more dominant; it may run into the hundreds of millions of dollars, or even a billion, depending on the total value staked. So what EigenLayer does, by reusing the same capital across a series of different services, is amortize the capital cost and let people opt into a variety of different services. One thing we think about a lot at EigenLayer, since we are so Ethereum-aligned, is that one of the important features of Ethereum is its decentralization. When you have a lot of these services, we want at least the services that bring the most yield to be built in such a way that even solo operators, or Rocket Pool operators, or other decentralized nodes, can still participate. This is the principle with which EigenDA is designed: the additional node requirements for participating in EigenDA are set to be very low relative to what is needed to run an Eth2 node. That's one of the underlying design principles of EigenDA.
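Sreeram's capital-versus-operations comparison can be checked with his own rough numbers. All figures are order-of-magnitude illustrations, not measurements.

```python
# Why restaking amortizes the dominant cost: node opex is small next to
# the APR owed on staked capital, so sharing capital across services is
# where the savings are. Figures are illustrative.

nodes = 1_000
opex_per_node = 10_000                  # ~$10k/year to run one node
total_opex = nodes * opex_per_node

staked_value = 10_000_000_000           # $10B staked (order of magnitude)
apr_bps = 500                           # stakers expect ~5% on capital
capital_cost = staked_value * apr_bps // 10_000

print(total_opex)        # 10000000  ($10M/year of operations)
print(capital_cost)      # 500000000 ($500M/year owed to capital)

# Restaking: the SAME capital secures k services, so each service's share
# of the dominant capital cost drops to capital_cost / k.
k = 5
print(capital_cost // k)  # 100000000 per service instead of 500M each
```

Even with thousands of nodes, operations are a rounding error next to the cost of capital, which is exactly the cost that restaking spreads across services.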
So, of course. Go ahead, Calvin. One macro consideration that really interests me, in a world where EigenLayer exists and there are many services built on top of it that stakers can choose from, is this: if you look at the way a new service starts up and recruits a validator set, and the way a validator or a validator firm discovers new services to validate for, there's not a lot of structure there. There isn't a smooth distribution and acquisition channel. So if EigenLayer can become a marketplace that connects stakers and validators with the services that need them, I think a lot of really interesting efficiencies on the infrastructure and operations side, and on the distribution and acquisition side, will arise out of that as well. Could you tell us a bit more about some of those efficiencies? What specifically
do you think might arise? I guess there are several angles on it. When I think about it from a strictly business angle, I think about Ben Thompson's aggregation theory, and how, for a lot of the most influential platforms in Web 2 and on the internet, the reason they're so influential is that they become the place where supply meets demand: they aggregate all of the supply, or all of the demand, into a single platform. A lot of what's interesting about crypto is that you can build systems that have the same potential from a business perspective but are actually designed as public goods, owned by many, many people instead of a single corporation. There's still a lot of efficiency you can derive as a platform from aggregating all the supply or all the demand in a market. So if EigenLayer does a good job: if you need validation, you only need to go to one place to get it; if you're a validator looking for opportunities to stake and validate, you only need to go to one place to find them. That's the perspective I have on it. I guess we're probably, or at least I'm, a little bit early in thinking about the actual nuts and bolts, the type of infrastructure that could make it more efficient for validators to discover opportunities or for services to distribute
tasks to validators.

One more point to add: if you look at why we started EigenLayer, as an academic I was running the University of Washington blockchain research lab, where we did a bunch of research on consensus protocols and scalability. One of the common theses for the entire crypto landscape is the ability to do permissionless innovation. Permissionless innovation basically means: you don't need to know who I am or where I come from. I can come and create a new service or a smart contract, and you don't need to trust me; you just need to trust the blockchain, and I can borrow that trust from the blockchain. This is a core value of the crypto ecosystem.

However, building infrastructure is anything but permissionless. Suppose you had a great idea for how to improve the Ethereum consensus protocol, or the Ethereum danksharding protocol, or a great idea for a bridge, an oracle, or a new chain. The only way we had until now was to go start a whole new community and align people around it. This is what I think Calvin was alluding to when he said the process of finding validators is ad hoc. It's not only that you have to find validators; you also pay them in kind: "buy and stake my token," and then you pay out inflation of that token. Each new service has to do this, and the amount of staked security you get is just proportional to how much is staked in your service.

So one way of thinking about it is that security is built as islands: for each service there is an island of security. But one thing we see from the real world is that we don't have city-states; we have nations, and we even have transnational security like NATO. The idea is that security is much better built in the aggregate than in isolation. If, instead of each service having its own staked capital and validator network, there is a common pool of security shared across many, many different services, we're all in a much better place. And the most important consequence is that launching new services, at the scale of the distributed system, becomes permissionless. My theory, at least, is that progress in crypto infrastructure improves quite a bit when you have permissionless innovation in crypto infrastructure, and that's really what EigenLayer powers.
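[Editor's note: the "islands of security" argument can be made concrete with a toy calculation. All numbers below are invented for illustration; this sketches the reasoning, not EigenLayer's actual economics or threshold parameters.]

```python
# Toy comparison of isolated vs. pooled security budgets.
# Isolated model: each service bootstraps its own staked token, so the cost
# of attacking a service is proportional only to that service's own stake.
isolated_stake = {"oracle": 50.0, "bridge": 120.0, "da_service": 30.0}  # $M staked

majority = 0.5  # attacker needs a simple majority of stake in this toy model

# The weakest "island" caps the security of the whole ecosystem.
cheapest_isolated_attack = min(isolated_stake.values()) * majority

# Pooled model: the same capital is restaked across every service, so
# breaking any one service means corrupting a majority of the entire pool.
pooled_stake = sum(isolated_stake.values())
cheapest_pooled_attack = pooled_stake * majority

print(cheapest_isolated_attack)  # 15.0  (attack the weakest island)
print(cheapest_pooled_attack)    # 100.0 (every service inherits pool-scale security)
```

The point of the comparison: in the isolated model, adding a new service does nothing for existing ones, while in the pooled model every service's attack cost grows with the whole pool.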
Yeah, it's a great model. You can make an analogy to the past, when every company would run its own servers in house; now, of course, that idea is totally outdated. And this creates a lot of great opportunities for new dapps and lower barriers to building, which is super exciting. One thing I was hoping you could double-click on is how EigenLayer is going to enable the BIT staking side of things. You mentioned the dual-quorum, dual-token design. Could you talk a bit about what will happen on the BIT side for the Mantle network to enable that?
Yeah, absolutely. The core idea of dual staking is that there are two different groups of token stakers: one is ETH stakers, or ETH restakers, and the other, in the Mantle case, is BIT stakers. Both of these quorums of nodes certify that data is available.

Let me double-click into the details. Just like on EigenLayer, nodes restake ether and are able to participate as validators in the EigenDA protocol. In the same way, there will be a special contract in which people can stake their BIT. When they stake their BIT, they are participating in the EigenDA protocol, and they express their interest in participating in data availability for Mantle. They then download and run the EigenDA software off-chain. Either the stakers run it themselves, or they delegate to an operator who runs it on their behalf, which is standard across all kinds of blockchains.

What the operator does is download portions of the data and submit certificates. The certificates are then aggregated by the Mantle rollup sequencer, which puts an aggregate certificate into the EigenDA contracts on Ethereum. Those contracts verify that both the BIT token nodes and the Ethereum restakers have received their portions of the data, and we get an attestation on Ethereum that all of this has actually happened. The rollup can therefore move ahead to its new state update.

So in practice, on the BIT side, BIT token holders stake the BIT token and appoint a delegate or operator, or run the software themselves if they're so inclined. They participate in EigenDA validation off-chain and, when they receive the data, send signatures to a common Mantle sequencer, which aggregates all of them and puts the result on Ethereum as an attestation of data availability.
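[Editor's note: the dual-quorum flow just described can be sketched in a few lines. This is a hypothetical simplification: real EigenDA operators sign certificates over erasure-coded chunks with BLS signatures, and the names, stake figures, and threshold below are illustrative, not the actual protocol interfaces.]

```python
from dataclasses import dataclass

@dataclass
class Operator:
    name: str
    quorum: str          # "ETH" for restakers, "BIT" for BIT stakers
    stake: float
    signed: bool = True  # did this operator receive its data and sign a certificate?

def dual_quorum_attested(operators, threshold=2.0 / 3.0):
    """Sequencer-side check: the state update proceeds only if BOTH quorums
    independently reach the stake-weighted signing threshold."""
    for quorum in ("ETH", "BIT"):
        members = [op for op in operators if op.quorum == quorum]
        total = sum(op.stake for op in members)
        signed = sum(op.stake for op in members if op.signed)
        if total == 0 or signed / total < threshold:
            return False  # one missing quorum blocks the rollup's state update
    return True

ops = [
    Operator("eth-1", "ETH", 32.0), Operator("eth-2", "ETH", 32.0),
    Operator("bit-1", "BIT", 1_000.0), Operator("bit-2", "BIT", 1_000.0),
]
print(dual_quorum_attested(ops))       # True: both quorums certified availability

ops[2].signed = ops[3].signed = False  # the BIT quorum fails to sign
print(dual_quorum_attested(ops))       # False: ETH signatures alone are not enough
```

The design choice worth noting is the per-quorum threshold: stake in one token cannot substitute for missing signatures in the other, which is what makes the two stakes a "dual" rather than a single merged quorum.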
Yeah, thanks for that explanation. I think that gives everyone a good understanding of the nuts and bolts of what's happening under the surface. What this leads to is a lot more bandwidth on the data availability side and lower costs for Mantle. So, a question for Jacob: could you tell us a bit about some of the use cases that are going to be possible on Mantle with these lower costs and this increased bandwidth for data availability?
Yeah, so we've been specifically trying to target use cases that weren't viable on traditional EVM networks. With gaming, for example, you can bring a lot more on-chain and have much lower transaction costs in the paradigm we're working on together. The same is true for social graph applications. I don't want to leak too much alpha, but some of the things we're building at Game7, including the HyperPlay product, have specific features that leverage the Mantle chain and are going to do some pretty outstanding things with NFTs that are really different from what anybody has done historically, and that probably wouldn't have been economically viable in the previous paradigm.

The gaming and social graph stuff is really just the beginning, though. There are all kinds of things we want to enable, and that persists beyond EigenLayer as well. With Mantle, it's a high priority to look at some of the best EIPs that haven't yet been adopted at the L1 level and adopt them earlier, so that we can enable better use cases when we combine them with EigenDA or other forms of extensibility. We want to be a bleeding-edge EVM chain on layer two for Ethereum, one that greatly improves the user experience and the capabilities of what blockchains can do.

Could you share a bit about some of the EIPs you're excited about and what you think they'll do for builders and users on a network like Mantle?
At the Sozu Haus hacker house we did, we actually ported EIP-3074 to an optimistic rollup framework. For anybody who's not aware, EIP-3074 is an EIP that allows meta-transactions, and contract capabilities in general, for externally owned accounts. So even before account abstraction, existing wallets can get access to meta-transactions without having to migrate to new accounts: have their gas paid by a third party, by the dapp developer, or by a friend; do subscriptions; or delegate the rights to manage funds to a spouse. All of those use cases currently require contract accounts or require people to migrate to account abstraction, and 3074 could enable a lot of those capabilities right away. So we've been interested in implementing it on Mantle.

We've also been interested in adopting EIP-4337, which is the true long-term account abstraction. It will take longer to migrate existing users to it, but it is insanely powerful and a massive user experience accomplishment.
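[Editor's note: the sponsored meta-transaction pattern described here can be illustrated with a toy relayer. This is a deliberately simplified sketch: real EIP-3074 / ERC-4337 flows verify secp256k1 signatures on-chain, via AUTH/AUTHCALL opcodes or an EntryPoint contract, whereas this sketch stands in HMAC for the signature scheme so it stays self-contained.]

```python
import hashlib
import hmac

def sign(user_key: bytes, payload: bytes) -> bytes:
    """Stand-in for the user's wallet signature (real flows use secp256k1)."""
    return hmac.new(user_key, payload, hashlib.sha256).digest()

class SponsoringRelayer:
    """A third party that verifies a user's signed intent and pays the gas,
    so the user's externally owned account never needs ETH for fees."""

    def __init__(self, gas_price: float = 0.001):
        self.gas_price = gas_price
        self.gas_paid = 0.0

    def submit(self, user_key: bytes, payload: bytes, signature: bytes) -> bool:
        if not hmac.compare_digest(sign(user_key, payload), signature):
            return False                 # reject forged meta-transactions
        self.gas_paid += self.gas_price  # the sponsor, not the user, pays
        return True

relayer = SponsoringRelayer()
intent = b"mint NFT #42 for alice"
signature = sign(b"alice-wallet-key", intent)
print(relayer.submit(b"alice-wallet-key", intent, signature))  # True
print(relayer.submit(b"alice-wallet-key", intent, b"forged"))  # False
```

The user only ever signs a message off-chain; everything about fee payment is the relayer's problem, which is the user-experience win the speakers are pointing at.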
So that's one area. There have also been some really interesting EIPs that Arjun, who leads growth for Mantle, has been looking at, especially around NFT experiences, like pNFTs and EIP-2981 or EIP-4910. Those are focused on improving protocol revenue for NFT creators and really creating incentives for creators. So I think we're likely to be a place that's on the bleeding edge of innovation in that area. I don't want to promise particular EIPs right now, but they're a high priority for us.
Yeah, and I think that's one of the benefits of the modular approach: we get to focus on what environment is going to be best for builders and users, and on what we can integrate to make that happen, while relying on a team like EigenLayer and EigenDA whose focus is on running a best-in-class data availability service. With that specialization, you can really see the benefits early on. Modularity is definitely a new way of building that we're excited to bring forward.
I think that's going to be the last question. But before we wrap, I just wanted to ask the speakers if there's an update or an event they didn't get a chance to share during the AMA.

>> I just want to add a point before we wrap up. I've been very excited about this partnership with Mantle because of their vision for how much scale they want to accommodate on a blockchain. This is something we've been very excited and passionate about. Jacob alluded to Game7 and gaming applications, and we discuss this a lot: there is massive scope for things like on-chain gaming and much better gaming interfaces and user experience, so that crypto assets can really live on a blockchain. And the structure that is ideal for much of this gaming experience is the rollup, because you get things like instant confirmation, optimistically, while also getting security backed by Ethereum. So we were excited to partner with Mantle on this vision.
Yeah, I'll add a little bit. What we're working on is obviously, we think, exciting and novel, and hopefully pushing some of the frontiers of crypto forward, but we need partners like Mantle who are also forward-looking and willing to build together with us toward a future that hasn't been done before. So we feel really grateful to work with the Mantle team, and the BitDAO team as well. And even past EigenDA, in the future there are going to be many opportunities to collaborate in a mutually beneficial working relationship. So yeah, thank you for working with us and for having us here. In terms of things to pay attention to: this will be a big year for EigenLayer. We'll have a lot of product releases and announcements coming out. Right now, follow us on Twitter at @eigenlayer; that will probably be the best place to stay updated on everything as we continue to launch places for our community to gather and discuss ideas, and as we announce our future product launches.

And of course, likewise from our side; it's great to work with teams that are so forward-looking.
And please join us in our Telegram and Discord; there's always conversation happening there. If you're building a dapp or looking to deploy one, we are actively supporting developers in our Telegram and our community, so there are a lot of opportunities to get engaged, a lot of opportunities to learn, and of course to experiment on these cutting-edge networks. So thank you everyone for listening. This has been amazing. We'd love to see a wave of emojis thanking our speakers for their amazing insights. And please look forward to the next AMA next week, and to other events such as Mantle's Happy Hour, which takes place in Discord. Thanks everyone, and hope you have a great rest of your day.