Community Call - What's Next in 2024

Recorded: Feb. 2, 2024 Duration: 0:54:19

Hey, we'll get started in just a second. Let me
make Justin a speaker here. Thanks for joining in.
Might be quite quiet today, as we only announced the Twitter space a little bit on the late side.
Okay, we'll give it just a couple of minutes here and then we'll get started.
We need some of that jazzy hold music going on as we get started. But thanks for being patient. Sorry we don't have any jazzy hold music.
We'll get started here in just a minute. We just got the word out there for everybody to join us, and we're going to talk a little bit today and discuss with the community. We'd love to, of course, have everyone's voices here, or if you don't feel like talking, you can shoot a text on Twitter or in Discord and share some of your thoughts.
I think we're going to try to cover some of the ideas the community has proposed around fees, some of the dynamic fees, and some of the new feature stuff that comes with the auction module, and how that's going.
How that maybe could evolve. And we just actually heard news of Crescent winding down, so that could also potentially be a topic of discussion. But overall, this is just a casual get-together to listen to the community and
kind of talk about some of the things that have been proposed in the Discord and governance channels. But welcome, Justin, to the Twitter space. Justin, if you want to share what the Hawk Networks and Althea team does to support Gravity, that'd be great.
Hey, yeah, so the Hawk Networks team has been selected by a governance proposal — I don't have the number off the top of my head, but it should be pretty easy to find — to provide general maintenance, feature implementation, and managing updates, that sort of day-to-day work.
The most recent thing we did was supervise the deployment of the auction module, which was developed via a separate governance proposal with Notional.
That proposal also supported the Blockscape team in doing one of the user interfaces for bridging in and out.
Many chains using Gravity make their own kind of interface — Canto, for example, has a Gravity-style interface onboarded.
But for those folks that want to use an independent bridge interface, there is bridge.blockscape.network, and those folks also got selected. And then the Chandra Station team, who may join us a little later,
were also selected to support some of the analytics sites. What are all the things that Chandra does, Justin?
Well, Chandra has, let's see. They're running a fork of spacestation.zone that has been updated. That was the original Cosmostation front end that predated even Blockscape's, and that's now running on their own Gravity Pulse — one second, I've got to remember that one.
Yeah, we can also drop some of these links in afterwards for folks to be able to find them.
Yeah, so Chandra runs an alternate generic Gravity Bridge interface on gravitypulse.app, in addition to a statistics page that shows information about bridge volume and collected fees.
And then finally, they have developed an auction module interface, which is at auction.gravitypulse.app.
So together, I think this should also set an example for other independent organizations that want to support Gravity Bridge maintenance and growth — there's a sort of template for launching a governance proposal in front of the community. Typically it goes through the process of Commonwealth, which is a forum.
You talk about it in the community's Discord and on community calls, and then it goes on chain. But it's certainly not a gated process or a permissioned process in any way, and we encourage everyone, if they find a good fit to support Gravity, to get one of those proposals up and moving forward. It's exciting to see that we have such a great diversity of entities supporting and maintaining Gravity.
Gravity launched in December of 2021, so this is the third year of Gravity bridging away.
Justin, as you often say, the goal is to make it boring.
The bridging happens in a very routine, very boring way, as a piece of infrastructure should. I would never want to be driving in my car and have crossing a bridge be exciting. It should never be that — it should be boring, a piece of infrastructure — and Gravity has done a really amazing job of being that routine bridging infrastructure that's kept a lot of really exciting chains and applications moving forward.
One of the things that was big this last year was the auction module. For folks that are listening now, Justin, should we just give a quick recap on what that does?
Yeah, so to recap the history here: Gravity Bridge was first deployed with no fee mechanism, and within, I think, several months of deployment — it wasn't quite right away —
the community decided that charging a fee on bridging out of Gravity Bridge would be the place to try and capture some revenue.
A very simple fee mechanism was implemented that added a second fee to the bridge-out message. There was the fee that was paid to Ethereum relayers, and then the fee that was paid to the chain.
And this was directly sent to validators and stakers.
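To make those mechanics concrete, here is a minimal sketch of the dual-fee bridge-out; the message and field names are illustrative assumptions, not Gravity Bridge's actual types:

```python
from dataclasses import dataclass

# Illustrative sketch of the two fees on a bridge-out message.
# Field names are assumptions for the example, not the real module's types.

@dataclass
class BridgeOutMsg:
    amount: int       # tokens the user wants moved to Ethereum
    relayer_fee: int  # fee paid to the Ethereum-side relayer for gas
    chain_fee: int    # second fee, distributed to validators and stakers

def total_debit(msg: BridgeOutMsg) -> int:
    """Total tokens deducted from the sender for one bridge-out."""
    return msg.amount + msg.relayer_fee + msg.chain_fee

msg = BridgeOutMsg(amount=1_000_000, relayer_fee=25_000, chain_fee=2_000)
print(total_debit(msg))  # 1027000
```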
And that is how it worked for, I think, a little more than a year, before there were some rumblings and discussions, mainly around there being a lot of not very active stakers who were not necessarily claiming their staking rewards.
So the next step was the auction module, which takes a share of these fees and auctions them for GRAV. This provides a way to more efficiently use those funds. In both cases, these are entirely community-driven proposals: a community member submitted a governance proposal, and it was approved. In the case of the auction module:
A governance proposal for the feature was created, then a governance proposal to pay Notional to develop the module, and then finally the Hawk team helped integrate the module and launch it, which lines up with our aforementioned maintenance proposal — we're currently doing maintenance for Gravity Bridge.
Yeah, well, it's interesting to kind of think about over that whole history.
And I think one of the things that might be interesting for the discussion today is how the implementation of oracles could potentially allow some different parameters around the fee module going forward, for the community to discuss.
I don't know if we want to tease that out a bit, Justin, so the community can think about what's possible and how they might want to move that forward.
Yeah, well, I always find this kind of interesting: after the last update, the question of what's next is never really answered. There's not a roadmap. It's very community driven and proposal driven.
But stuff always ends up showing up in the conversations, and after a few months it's typically pretty clear what the next update is going to look like.
Even though, you know, maybe the proposal isn't quite up yet. So I always find this a really fascinating process to be a part of.
So since the deployment of the auction module, the questions were, well, how efficient will it be? Will enough people participate?
You know, will it really work well or will it be disruptive in some way? And the answers to all of those have been pretty positive. The auction module seems to be working well. There are plenty of participants in the auctions.
And, you know, everything is sort of going off right on schedule.
So now the community discussion has turned to really two different topics. One is noticing that the composition of Gravity's volume is changing.
Earlier on in Gravity's lifespan, there was a lot of stablecoin traffic — maybe it was 60/40 or 70/30 stablecoins versus ERC-20 representations. This is where projects want to bring an ERC-20 from Ethereum to their own app chain, or they want to bring something from a Cosmos SDK chain to Ethereum.
And gravity bridge is still the only permissionless chain to do this. And so this is taking on a larger and larger portion of gravity's volume.
Which is kind of timely because it means you're getting a lot of odd tokens. If you are a gravity staker and you are collecting fees, you're getting a really odd sort of basket of tokens.
And this has led the community to think more about what tokens should be directed to the auction module and what tokens should be directed to stakers' balances.
And maybe some sort of feature where the community can decide: hey, all of these tokens should go to the auction module, and these tokens we want to go into user balances.
And that sort of control, for example, would allow a lot of the dust tokens to be swept into the auction module, where they can be dealt with more efficiently, because it's difficult to deal with one one-millionth of a cent of SHIB token.
Which is a very real situation for a lot of gravity stakers who have collected a certain amount of that and various other obscure tokens and fees.
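A sketch of what that routing control might look like — the whitelist contents and denom strings are hypothetical, and the real parameter would be set by governance:

```python
# Hypothetical routing rule: governance-whitelisted fee denoms go to
# staker balances; everything else is swept into the auction module.
# Denom names below are made up for the example.
STAKER_WHITELIST = {"gravity-usdc", "gravity-usdt", "gravity-weth"}

def route_fee(denom: str) -> str:
    """Return the destination for a collected fee of the given denom."""
    return "stakers" if denom in STAKER_WHITELIST else "auction"

print(route_fee("gravity-usdc"))       # stakers
print(route_fee("gravity-shib-dust"))  # auction
```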
The other major interesting feature is that the auction module provides an oracle. By that I mean, when the last auction for a given token completes, we now have an exchange rate for it.
And this is really important because it makes it possible to do something that semi centralized and centralized bridges sort of take for granted, which is volume based fees.
You know, previously, due to Gravity Bridge's permissionless nature, it wasn't really practical to develop any sort of volume discount system for the fees, because you'd have to define what a given transaction was worth in order to know that.
And given that it could be any sort of random token, including one you've never seen before.
There was really no practical way to get the exchange rates.
But now with the auction module, we have a local source of truth for that sort of information, which would make it possible to provide a volume discount, and that could work in one of several ways.
The one that I've been hearing about is that you get a discount if you pay the fee in GRAV, or something similar to that. People who are bridging would essentially put down, let's say, 50% less GRAV — obviously, the parameters are up to governance — than they would otherwise pay in USDC or ETH or NYM or whatever token they happen to be bridging, for the chain fee.
So yeah, I think that's generally the picture of the next upgrade.
It seems that this is what the community is interested in. And now it's just a matter of timing logistics and implementation, which are all important things that I'm sure will be sorted out over the next couple of months.
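As a rough illustration of how the pieces could fit together — the implied auction rate, the fee rate, and the 50% discount below are all placeholder numbers, since governance would set the real parameters:

```python
# Hedged sketch: the last cleared auction doubles as a price oracle,
# enabling a discount for paying the chain fee in GRAV. The fee rate
# and 50% discount are placeholder parameters, not real chain values.

def auction_rate(grav_paid: int, tokens_sold: int) -> float:
    """GRAV per token implied by the last completed auction."""
    return grav_paid / tokens_sold

def chain_fee(amount: int, fee_bps: int, pay_in_grav: bool,
              rate: float, grav_discount: float = 0.5) -> float:
    """Fee in bridged-token units, or in GRAV at a discount."""
    fee = amount * fee_bps / 10_000
    if pay_in_grav:
        return fee * rate * (1 - grav_discount)  # convert, then discount
    return fee

rate = auction_rate(grav_paid=10_000, tokens_sold=20)  # 500.0 GRAV/token
print(chain_fee(10_000, 10, False, rate))  # 10.0 (in the bridged token)
print(chain_fee(10_000, 10, True, rate))   # 2500.0 (in GRAV)
```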
Yeah, I think that was a great explanation.
Again, if the community has any questions, feel free to raise a hand or drop those in Discord for more clarity. And of course, Justin and many of the team are available in Discord a lot of the time too, so follow up there.
Yeah, I think that makes a lot of sense. I'm not necessarily sure that I have any questions around that. That seems kind of like the next step. Slava, I'll get you approved here and you can ask. Thanks for joining us today.
Slava, it looks like you're a speaker. Please feel free to ask your question.
Hello, can you hear me?
Okay. Hi, guys. I'm Slava — you may know me from Discord. I'm a validator on Gravity and an OG of Gravity. So I'm actually one of the ones who actively uses the auction module.
As I understand, the main point of this was to implement in Gravity some kind of burning mechanism. And I think it's doing great, because for me it's fun to try to make a bid in the auction and get some rewards.
But as Justin just said, it's some kind of pricing, or oracle — like, what amount of my GRAV tokens I would pay. Let me take a look at the current auction.
Like 23 USDT right now, or for example the USDC token auction, ID 169: right now the bid is 10k GRAV, so the revenue will be about 20 USDC tokens. Okay, it's fine. But sometimes it causes a situation where it's almost unprofitable to make a bid.
And that's it.
I think the next step of implementation is that for at least some tokens, like Kujira, like Injective, the amount of the token should build up to where it's worthwhile for me to spend my time making a bid on these tokens.
As Justin just said, that's like 50 cents.
Yeah, so first of all, with the auction module, I was kind of expecting it to be mostly bot driven, just because it's pretty easy to handle programmatically.
And yeah, you really need bigger amounts of everything in order to make it worth a person's time to bid. But also, auctions are carried over.
So the amount of token, the amount of a specific type of token is going to continue to build up until it's worth somebody's time to bid. The fact that these amounts are not building up indicates to me that somebody is going through and probably bot bidding them when they are barely profitable.
So that's sort of the intended design.
Unfortunately, anything like this is always going to end up being bot driven, which is really what's best for Gravity as a whole, because it means that the auction module gets pretty efficient pricing.
But it might not necessarily be best for people who are trying to participate without a bot to assist them in bidding.
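The carryover behavior Justin describes can be sketched like this — a simplification of the module's logic, with made-up parameter names:

```python
# Simplified carryover: if no bid meets min_bid, the token balance rolls
# into the next auction period and keeps accumulating until bidding pays.

def settle_period(carry: int, new_fees: int, best_bid: int, min_bid: int):
    """Return (carryover, amount_sold) after one auction period."""
    balance = carry + new_fees
    if best_bid >= min_bid and best_bid > 0:
        return 0, balance   # auction cleared: everything is sold
    return balance, 0       # no qualifying bid: everything carries over

carry, sold = settle_period(0, 40, best_bid=0, min_bid=100)
print(carry, sold)  # 40 0  -- too small, carried forward
carry, sold = settle_period(carry, 70, best_bid=120, min_bid=100)
print(carry, sold)  # 0 110 -- accumulated amount finally clears
```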
So, next: I read that you said we are going to implement some kind of whitelisted tokens here, to increase the amounts in the auctions?
This concept — and obviously the community has not yet really spoken and narrowed down exactly how this would work — but the idea that I've been hearing is that you would whitelist, let's say, your common stablecoins and ETH.
So USDT, USDC, and ETH would go into stakers' wallets, but the rest — all of the other types of tokens — would go to the auction module.
So, for example, you were talking about the amount of Kujira being kind of too small to bid on. Well, it would increase the amount of Kujira that ended up in the auction module to make it more reasonable to bid on.
Which is good because Stakers are not likely to take their, you know, very thin slice of a already relatively small amount of a token and be able to efficiently use that or go and swap that.
Yeah, because I always look at the auction. Right now it's the 12th auction in my account, and at first, Kujira was a worthwhile amount of tokens.
And yeah, for USDC, USDT, and ETH tokens, as I see it, the bridged amount is enough to be spread across the stakers, I think.
And a small threshold would be good, in my opinion. Maybe we should set some threshold, like the bid should be about $10 worth, or something like that.
Yeah, well, I mean, that's what the min-bid parameter is for, and that's how we originally tried to set it. We tried to set the min bid to be $10 or something.
People kind of like the fast bidding at the end and the community ended up putting up a proposal to lower it. I'm not really sure how I feel about that, but obviously that value can be adjusted by the community to try and make it such that the auctions,
or such that the amounts build up a little more before they're competitively bid on. Right now, the minimum bid is very low, and clearly somebody's going through and clearing these things out.
As soon as they make any sort of, as soon as they're profitable, or maybe even before they're profitable, I don't know. I haven't really dug into the bot behavior other than to try and make sure that the auctions weren't underbid too much, and that seems to be the case.
You know, so it seems to be working pretty well. But yeah, I do agree that a higher min-bid may be healthier in terms of making it practical for humans to participate and not just bots. And that's a parameter anybody can submit a proposal to change.
Well, my point is — I agree with you, but I'm mostly saying that we should accumulate some tokens. We shouldn't spread all tokens in every auction; like I said, Kujira could accumulate to some amount and then be spread, to be acceptable in the auction. That's it.

Right — but if the min bid is higher than the value of the token being auctioned, nobody's going to buy it. Somebody could buy it, but they'd just be wasting money.
So a higher min bid is the solution here, and then the UI probably needs to hide things where it doesn't make sense to bid on them, just to reduce visual noise.
And may I ask a little side question about bridging? You know, there is Composable, and they're trying to implement IBC with it. Can you tell me a few words about that?

Composable, and their smart contract implementation of IBC, right?

Yeah, Composable's IBC is a smart contract, and this is really interesting stuff. I also used to be a validator there, but I just want to know your opinion, as one of the people who designed the current orchestrator.
Yeah, so, okay. Keep in mind, with a bridge there are two flows: the flow from the Cosmos SDK side to Ethereum, and the flow from Ethereum to the Cosmos SDK side. And this is the flow of information as well as tokens.
And so the composable IBC implementation, there were two parts to this flow. There is the part where they have an Ethereum Lite client on the Cosmos side, and the part where they have a ZK or very efficient bridge implementation of IBC on the Ethereum side.
Now, what Gravity Bridge's original design was, and still is, is to be as close as possible to an IBC client implementation on the Ethereum side. So the Gravity.sol Solidity contract is very similar to an IBC client.
It's just a very specific version of one that doesn't match the IBC protocol, because things tend to get very computationally inefficient for Ethereum.
And this comes down to, first of all, the fact that Ethereum signatures and Cosmos signatures are different key types. On Ethereum, you can use the standard Ethereum signature type, and it only costs a small fixed amount of gas to verify.
On the other hand, if you want to verify a native Cosmos signature, suddenly the amount of gas you need goes up exponentially, because instead of using a precompile, you are implementing a verifier in the EVM in Solidity. So that's a whole program.
And this is one of the things that would make verifying the entire validator set, as an IBC channel update typically does, really stupendously inefficient and too costly.
And this is where composable gets into the sort of zero knowledge proof bridge design. They need to reduce gas costs, because they're trying to do something that is directly IBC compatible on the Cosmos side without requiring the validators to produce different signatures, which is one of the main things that gravity does is that it requires the validators to produce and submit Ethereum signatures.
And those Ethereum signatures can then be very efficiently verified.
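To put rough numbers on that asymmetry: the ecrecover precompile has a fixed cost of about 3,000 gas, while a Cosmos-native signature verifier written as EVM bytecode costs orders of magnitude more (the 500k figure below is an assumed ballpark, not a measurement):

```python
# Back-of-the-envelope gas comparison for checking one signature from
# each validator on Ethereum. ecrecover's ~3,000 gas is the precompile
# price; the in-EVM verifier cost is an assumed order-of-magnitude figure.

ECRECOVER_GAS = 3_000          # native secp256k1 recovery precompile
COSMOS_SIG_IN_EVM_GAS = 500_000  # assumed cost of a Solidity-level verifier

def batch_verify_gas(n_validators: int, per_sig_gas: int) -> int:
    """Gas to verify one signature from each of n validators."""
    return n_validators * per_sig_gas

print(batch_verify_gas(100, ECRECOVER_GAS))         # 300000
print(batch_verify_gas(100, COSMOS_SIG_IN_EVM_GAS)) # 50000000
```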
So, long story short here, to keep this at sort of a high level, the problem with ZK proofs is that they take a long time to compute.
And depending on whether or not you open source the proof-making software, it can act as a point of centralization. Even if you do open source the proof-making software, it can still act as a source of centralization.
When it costs several dollars in compute power to make a proof — you have to go and spin up a machine, etc. — it's not sufficient to just have enough ETH to pay the fee.
You must first make the proof and then submit the transaction, which is cheaper, but not, you know, you have to keep in mind the holistic cost, including compute power.
So, fundamentally, the Cosmos SDK to Ethereum flow doesn't change that much if you have this zero-knowledge-proof IBC-like client.
The security isn't actually improved because now you've actually added a new vector for vulnerabilities in this ZK proof generation.
You now have to maintain this ZK proof, you know, this ZK proof infrastructure, and the smart contract that's verifying these ZK proofs is also more complicated.
So, you've kind of generated a bunch of problems for yourself, and it's meant a lot of development money, but not necessarily made something that is superior from a security perspective, or necessarily superior from a final cost of operation perspective.
Although, in ideal cases, it could be cheaper for a single transaction, everything can really be batched by the time it's done.
So, you know, if you can batch a sufficient number of transactions, everything is pretty close to similarly efficient — it all gets pretty close to the same.
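The batching point reduces to simple amortization arithmetic; all the gas numbers here are invented for illustration:

```python
# Amortization sketch: a large fixed per-batch overhead, split across
# many transfers, approaches the per-transfer cost of a cheaper
# single-transaction design. All numbers are illustrative assumptions.

def per_transfer_gas(batch_overhead: int, per_item: int, n: int) -> float:
    """Effective gas per transfer when n transfers share one batch."""
    return batch_overhead / n + per_item

print(per_transfer_gas(400_000, 20_000, 1))    # 420000.0
print(per_transfer_gas(400_000, 20_000, 100))  # 24000.0
```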
Now, the other direction, trying to bring information from Ethereum over to the Cosmos SDK side.
Now, this is the part that I would be interested in and think would be more valuable from the perspective of what does a light client for ETH2 look like.
Unfortunately, for all of us, the answer is, ETH2 light clients really don't exist in a meaningful way, and you can't necessarily make one.
And the reason for this is actually fairly intentional.
Beacon chain consensus, ETH2 consensus does not have considerations made for efficient light clients in the same way that Cosmos consensus does.
Cosmos consensus has specific properties and emits specific signatures to make IBC possible and efficient.
And what we did with Gravity is that we made the gravity module emit specific signatures and follow specific constraints in order to keep the Gravity.sol contract efficient.
What nobody can do, well, nobody except the ETH team, or particularly the ETH community, is make ETH2 do these things.
And the reason is that they don't want to.
They have a strong incentive not to, because if they emit these sorts of proofs, it would make life for bridge developers easier and make bridging off of Ethereum easier, which is not in their best interest.
So long story short, there are no good ETH2 light clients, so we can't really make a significantly better way to move information from ETH to the Cosmos SDK side of the bridge.
Now, if Composable built a really advanced ETH client — because I can't really call it light at this point — for the Cosmos SDK, that could in theory provide better security. But it would also make running the chain that was using this bridge effectively the same as running one program,
which was ETH2 plus the Composable chain, or whatever chain you want to call it.
You know, this is the reason why right now you need to run your own ETH node and point a program at the ETH node when you are running gravity bridge, or really anybody else's bridge that's going to Ethereum, because there is no, yeah, there's no light client.
And this is also where Polygon's zkEVM and things like it come in. These are innovative in part because, as the EVM executes, they emit the right proofs that you need to make a light client — because all zero-knowledge proofs are really proofs over an existing light client.
And that is why a ZK proof of Ethereum state for a bridge is actually something you can't really do: you face information availability problems. There are no signatures, no information that can tell you this is the actual state of ETH2 versus a fork of ETH2, due to lack of finality.
Okay, I'm sorry, this is all kind of complicated.
But long, very long story short, I don't think that what composable is doing is a very serious improvement on the state of the art in terms of security.
Theoretically, they could get lower operational costs than gravity bridge for individual transactions, but they will be doing that at the expense of a lot of development time.
And with that development time, you always adopt additional bridge risk, you know, that there's a flaw in the implementation, not that gravity bridges risk for no bridges risk free, and you should never treat them that way.
But the more complicated things get the riskier they get as well.
So as a general rule of thumb.
So this is why I am not particularly excited about it. It would certainly be really great if we could have a proper IBC EVM contract.
And in fact, with some of the things that are being done with the zkEVM, etc.,
you could do that on those chains, but you can't necessarily do it to Ethereum itself.
Okay, I think that's the best summary I can give of this situation.
And as far as ways you could improve the design of Gravity Bridge: you could take all the stuff that is currently in the orchestrator and move it into ABCI++ within the gravity binary itself.
And this will be less failure prone in a lot of ways, you know, validators wouldn't get slashed for not submitting signatures because it would be almost impossible to fail to do so, unless you modify your binary.
Whereas right now, if your orchestrator crashes, a validator could be punished for not submitting signatures, due to just not watching things carefully enough, having a process crash, etc.
So that's definitely an improvement. But would it be a fundamental security improvement to Gravity Bridge? Not really — it would be a logistics improvement, and logistics improvements should never be discounted.
But, you know, it wouldn't be fundamentally superior.
Okay, thank you. I have one more question, but I don't want to ask it because I think the answer will be long.

That's fine — I think the dialogue is great. Go ahead, if you can.
Okay, I'm wondering, to be honest, why we don't have a Solana side for our bridge, because the Solana community is one of the most actively growing. If we have an EVM Solidity contract, can we do the same there? You know, there is Rust — I don't know what the language is on Solana.

So, yeah, that's a good question. Fundamentally, design-wise, you can take the Gravity Bridge design and have it work on multiple chains. All of these things are possible.
This is a question about trade-offs and operations. So, first of all, I think a bridge to Solana would actually be a little bit easier, because you wouldn't have to worry about gas optimization so much — though of course, you do end up having to worry about gas optimization a lot, because what if you're wrong and Solana gas prices go up in the long term? Because Gravity Bridge is still, as far as I know, the only non-upgradeable contract deployment for any major bridge — and I'm not just talking in the top 10, I'm talking in any of them. I don't actually know of any others that are not just Gravity forks. And this is a big deal, because we kind of made a bet with the original development of Gravity Bridge.
We said: okay, we're going to keep the contract simple, we're going to do it right, and we're not going to need to maintain any centralized control — that's not going to happen.
And this certainly seems to have panned out well, at least for three years, and hopefully many more.
But trying to support a lot of different chains on gravity, sort of saddles the gravity community with the additional development costs of multiple chains, and you can see how that works out for a lot of these big multi bridge providers where they try and bridge to
and from everywhere. And pretty quickly, they start to lean heavily on centralization to deal with some of the problems of running a multi bridge like that.
Because what people want to do with multi-bridges gets weird. Let's say somebody wants to bridge something — say USDT — and they want to bridge USDT from Solana to Ethereum via a Gravity-style bridge. Well, that means they're locking up some USDT over on Solana.
Then they're moving over, and they're trying to bridge it out of the Ethereum side. Well, if there's not enough Ethereum USDT in that bridge, then the user can't bridge out.
So multi bridge projects end up with some sort of centralized token balancing. Somebody needs to be able to move some USDT around to make that work.
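The liquidity problem here reduces to a simple invariant — a bridge-out can only succeed if that side's escrow holds enough of the token, regardless of total supply across chains (the balances below are made-up numbers):

```python
# Sketch of the multi-bridge liquidity problem: bridging out on one side
# requires enough of the token locked on that side specifically.

locked = {
    "solana":   {"USDT": 1_000_000},
    "ethereum": {"USDT": 50_000},
}

def can_bridge_out(chain: str, denom: str, amount: int) -> bool:
    """True if the chain-side escrow can cover the withdrawal."""
    return locked.get(chain, {}).get(denom, 0) >= amount

print(can_bridge_out("ethereum", "USDT", 40_000))  # True
print(can_bridge_out("ethereum", "USDT", 60_000))  # False
```

Someone (typically a centralized operator) has to rebalance the escrows to keep the second case from stranding users — which is Justin's point about where centralization creeps in.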
The amount of additional features and problems you encounter and potential vulnerabilities and issues as you continue to scale up chain support is really, really hard, really hard to solve.
And the bigger multi-bridge projects — as we saw with, oh, come on, what was it? Anyways — you see a lot of these bridge projects that end up being almost completely centralized in their operation.
Because running a big bridge between multiple chains like this is a complicated enough operation if it's just, you know, like if you don't have to make it decentralized.
And that was our biggest 100 percent major focus with gravity bridge is to ensure that bridge operation required absolutely no involvement of centralized parties.
And this includes everything from validation, which is permissionless to relaying, which is permissionless to Oracle information, which is permissionless.
All the way to smart contract upgrades, which simply don't happen and therefore aren't a centralization risk.
So all these different touch points where there were where there's typically centralization and bridges we managed to avoid.
And that's harder and harder to do as you add more bridge destinations.
So, yeah, this is what has made the Gravity Bridge protocol the most forked bridge protocol after IBC: because it is as low maintenance and maintenance-free as decentralized protocols typically are.
You know, for example, IBC, even the relaying for IBC is more complicated than the relaying for gravity bridge.
But yeah, IBC is not a protocol that's very high maintenance. It's not closed source with a million different components.
And that's what's made it useful to the wider world and many different communities.
So, yeah, to answer your question, trying to add a Solana bridge is definitely possible.
The question is really if gravity and the gravity community really want to try and go after this sort of big multi bridge market.
And from what we've seen, if you take a look at the numbers from Axelar or Wormhole or many of these other projects, some 80 or 90 percent — and that has shrunk some, maybe 70 percent — of their volume is still on Ethereum.
And especially since Gravity Bridge and the community have chosen to collect fees based on volume,
it doesn't make sense to do a loss-leader project for less volume.
You know, if Solana was doing 100x the volume of Ethereum, it would make a lot of sense to go and add it as a second bridge destination — accept some more risk, probably accept some other compromises around centralization, because you've got to accept some.
And then you reap that reward of more volume.
But that's not what's happening for any of these other chains. These other multi-bridge projects are taking a lot of VC money and using it to develop these extra integrations to different chains, and then, as far as I can tell, not really making a return on that. So I've never particularly wanted to present to the Gravity community, like, hey, we should do this, because I personally don't think it makes financial sense for Gravity.
I don't think a lot of these multi-bridge projects make sense, other than in the context where, typically, communities need a bridge and they come to the project, rather than the bridge operating as something that is independently viable on its own users.
Okay, while we're talking about bridges, I have two more questions.
Well, the main idea when Gravity Bridge started was to bridge Ethereum liquidity to Cosmos. And now we have Noble, and what now?
And what else on ICS, like Neutron? Kava has USDT and Noble has USDC.
And don't you think, how can we compete with that? Right now it's native Cosmos-based liquidity.
Because, as a part of the Gravity Bridge community, I just want to increase the volume and increase the attraction to Gravity. But now we have these kinds of native tokens.
So what's next? How will we compete with that?
Yeah, well, I actually mentioned this a little bit earlier. What you'll notice is that Gravity volume and new projects using Gravity are moving more towards app chains that need to take their token to Ethereum,
or projects on Ethereum that need to take a token to their app chain. And these things are obviously not the two major stablecoins, USDC or USDT. Projects like these have made up 40 percent of Gravity's TVL, up from almost none, and that percentage is increasing.
So I think the key question now is: how do Gravity and the Gravity community better position themselves to deal with that change, you know, having fewer stablecoins and more different project tokens, the NYMs, the FUND token, and various others; there are a couple of new ones that just submitted proposals a few days ago.
And what we've been talking about with the auction module changes and the fee discount parameters can be used to start to be more efficient about handling those different types of tokens and take advantage of that volume. Because that feature
is still something, you know, the ability to take a token from a Cosmos app chain to Ethereum in a decentralized, permissionless, and non-gated way is still unique to Gravity within the Cosmos community, and that is really where people are still showing up.
And I expect that there is also demand for other tokens like ETH, wrapped ETH, you know; it's not like there is nothing to bridge beyond stables.
And this is where I've been very happy with the sort of efficiency and decentralized nature of Gravity Bridge, that it's not particularly high maintenance. Because let's imagine that we had focused in on building bridges to like 20 different chains, mostly for stablecoin demand.
And then all 20 of these chains got native stablecoins; that wouldn't exactly be a great position to be in if you have something that's difficult and expensive to maintain or that has other operational issues.
And this is why I always really like to focus on making sure that anything Gravity Bridge does is really easy to operationalize and sort of just keeps ticking.
Yeah, and I think the other point is that, you know, what we saw happen, say, when they launched as well, is that a lot of the limitations around the gating from the centralized issuers kind of come into play, and they hit caps pretty quick.
So, you know, there's definitely market opportunity in just having that permissionless interface where anyone can move assets and there aren't those artificial caps.
So it's maybe more spiky volume, but like Justin said, there's the sort of increasing volume around, you know, alternate tokens and people moving from their token to Ethereum.
But that certainly, I think, is also a really needed function of a permissionless and decentralized bridge.
Thank you. And one more, maybe last, question about Althea. What should we expect, you know, with all this liquid infrastructure, this fee sharing and bandwidth sharing?
As I understand it, you will use Gravity Bridge to connect Althea to Ethereum, so will it share or help to increase the volume? I don't know.
Yeah, so we'll be using Gravity. As for how much volume it will generate, we can't really make solid predictions about that, other than, you know, the volume that Althea's operation already generates for Gnosis Chain, where it's currently running.
And that's not, you know, it's not huge. It's pretty big in terms of DePIN projects, as far as actual customer revenue.
But it's not a big number compared to what Gravity was used to seeing from Canto, for example.
So I think this is more a question of like what will happen over time and in what direction these things are going. Obviously, I'm hopeful.
But I certainly can't say that it's going to blow out gravity bridge volume on day one.
Okay, thank you.
Yeah, we appreciate that outlook, and, you know, we're obviously excited. I think from the Althea team's side, we've been building that side of the project, or the main Althea project, for many years now, have seen that traction, and
are now scaling, and it's an exciting place to be. But, you know, I think predictions are one of those things that we just can't make, right?
You know, but it is exciting to see the application of blockchain solving real problems in a tangible way and in an efficient way.
And, you know, we'd love for the Gravity community to join the Althea community as well and be a part of that discussion, as there is a lot of overlap, certainly in terms of ethos and in building real things and real infrastructure that has applicable solutions.
And so, yeah, I appreciate the questions.
Yeah, I just really want to share that I'm fascinated by this; it's one of the rare projects that tries to integrate blockchain into real-world applications, and it's cool to me.
I always look at your Twitter and it's cool. Yeah.
Thank you. It's pretty cool for us too.
And, yeah, it's been a long road and that's the thing about real applications is that they take a long time to build and they don't necessarily grow super fast.
But when they do, it sticks around, you know; it's not the same as, like, one day of enormous volume or enormous use and then it all goes away the next.
So it's a very different philosophy from what you see around the crypto ecosystem.
And it's something we try and bring to everything we do: to build stuff to last, to really work well, and to be what people would want in a practical sense rather than in a very short-term sense.
Well, folks, we're coming up on the top of the hour here. I really appreciate those questions; I think they got some great dialogue going, and we got to discuss a lot of those core concepts. I'm excited to see how the auction module evolves and the fees become smarter and more dynamic.
I think we'll probably put a pin in that conversation for now and meet with everyone here again; we usually do these twice a month, so probably back in a couple of weeks.
Please feel free to stay engaged. We do have a Telegram, but most of the discussion happens in Discord, so that's the place for folks who maybe have some questions after this call or want to get involved.
There was also a community bounties program that passed a while ago, but it hasn't been moved forward by the community. If folks are interested in moving that program forward or in participating in a project that earns a community bounty,
Yeah, check the discord. We'd love to hear from y'all and keep that conversation going there too.
I really appreciate everybody jumping on the call today, and I look forward to the third year of Gravity bridging, coming back here in 2025, and hopefully sharing a year of successes as well.
So thanks again everybody for two successful years and here's to 2024 and talk to y'all in a couple of weeks.