Lava-naut Bootcamp 🌋 | Rethinking RPCs in the Cosmos

Recorded: Jan. 18, 2023 Duration: 1:00:58


How's it going, everybody?
We'll invite one of our speakers up on stage and give a few minutes for people to trickle in. I'm interested to see how this time slot works out as far as viewership and
engagement. We usually do these later in the day, but we're happy to shift our time if we find that there's a slot that fits more people's schedules and time zones. So it'll be interesting. We don't usually do midday spaces.
We will have sent you a request. Yeah, ping me if it's not working or something, but
I hope you can hear me.
Hey, there we go. Hello. Sound. Can you invite Gil? Oh, yeah. And yeah, here he is. Cool. Cool. I sent Clipper an invite as well, but feel free to stay down if you weren't planning on participating.
Cool, so...
Do we think it's going to be mainly YouTube, right? Oh, cool. We have the Lava Network coming up as well.
Yeah, we have Ethan from Lava and that's all.
Cool, okay. Hey guys, awesome.
How's it going? Thanks for joining us. Pleasure. Nice to meet you.
I think we can maybe just start with little intros, which will give people time to trickle in. So obviously this is Timmy here on the Spark account, and today we'll be chatting with Lava Network. Oh, actually, before we get into it, let me shoot a
message to Finna to make sure Twitter Spaces gets this recorded. Give me two seconds. Actually, can you make the Lava Network account the co-host instead of me? Yeah, absolutely. I'll shoot that invite right now.
Hopefully Finna has a machine open and can hop in and start recording here soon, but we are also recording on Twitter. Cool, so we'll get into it a little bit. Yeah, today we're going to be chatting with Lava Network, which I do not claim to be an expert on what you guys are doing, which is one of the reasons I'm
pretty excited to chat and dive in. It's an area of the Cosmos stack where I don't have a ton of experience. But I'm really excited because, at a high level, you guys are one of the projects doing something different. Like I said in one of the tweets earlier, you know,
not just another DeFi product or an NFT collection. You guys are looking at the entirety of the Cosmos system and identifying new areas that need improvement, areas that have so far been pretty overlooked, mainly centered around RPCs. So I'm excited to chat with you guys. For everyone listening, we'll definitely go
through what RPCs are and give you a bit of background knowledge; I would like some as well. But for now, maybe we just start with some brief intros for our speakers. So maybe we'll start with the Lava Network account with Ethan, and then we'll just go down the line. Yeah, sounds good. Hey, everyone. Thanks for
joining the space here. My name is Ethan. I'm leading marketing and community at Lava Network. I have been with Lava for about, I don't know, eight months now, almost since its inception. It's been a wild, wild ride. Before this, I was in big tech,
a very different world, and yeah, I was doing marketing and sales marketing there as well. Web3 is something I've been following for a few years now, so I'm super excited to now be working in the space. And yeah, that's it from me. Awesome.
Maybe we'll do Yvonne next and then Gil at the end. Sure, hi everyone, my name is Yvonne. I'm doing marketing and community at Lava. Like Ethan, I'm also kind of here since inception, as Lava is a very fresh project,
very new. And yeah, I'm not that long in the space, something like a year. I was doing marketing and community for a few NFT projects that eventually didn't launch, and then I started doing marketing and community for Lava.
And then last but not least, Gil. Is that how we say it? Yeah, that's correct. Yeah. First, thanks for having me. Real pleasure to connect and chat here. My name is Gil. I'm the co-founder and CEO of Lava.
My background is in cybersecurity. I've done reverse engineering and looked into intricate, complex systems for over 15 years. I got into the space about two years ago,
and I built a bunch of bots, and it was really, really a lot of fun. And from that, that's when I first learned about the problem of RPC and decided, together with my co-founder here, to found Lava.
Awesome. Well, I like hearing everyone's stories, just because everybody gets to where we are in Web3 quite differently, so it can be interesting. But cool. So I think what will make the most sense for myself and probably a majority of listeners
is, before we get into the specifics of Lava Network and how you guys are changing the game, reinventing or rethinking how it's done, maybe let's just do an overview of what RPC nodes are and how they fit in
to the Cosmos stack, or whether they're unique to Cosmos. We'll just start at the top. So whichever one of you feels best suited to answer: let's break down, in normie terms, for myself included, what exactly RPC nodes are and what role they play in the Cosmos ecosystem.
Sure, we'll try to make you an expert today. Okay. And if you have any questions or you want to dive deeper, let me know, and I'm sure the team can also reply. So
Blockchains are these amazing distributed databases, right?
But to read from these databases, you need to run a node. These nodes basically sync with the network, verify the consensus, and verify that everybody has the same database. So if, for example, there's an application on this database and you want to get the
data from the application, let's say a swap function, and you want the price of two coins, then you need to talk to the database. RPC is the standard way of communicating and getting that data from the database on Cosmos and on many other blockchains.
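To make "talking to the database" concrete, here's a minimal sketch of what such an RPC request looks like. Tendermint/CometBFT-style Cosmos nodes speak JSON-RPC 2.0 over HTTP; `status` and `block` are real method names, but the endpoint URL in the comment is a placeholder, not any specific chain's.

```python
import json

def make_rpc_request(method, params=None, request_id=1):
    """Construct a JSON-RPC 2.0 request body for a Tendermint-style node."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params or {},
    }

# Ask the node for its sync status (latest height, catching up or not).
status_req = make_rpc_request("status")

# Ask for the block at a specific height (heights are strings in this API).
block_req = make_rpc_request("block", {"height": "100"}, request_id=2)

print(json.dumps(status_req))
# Actually sending it requires a live endpoint, e.g.:
#   POST https://rpc.example-chain.org/ with the JSON body above
```

An application frontend fires requests like these every time it renders a balance or a price, which is why who answers them, and how honestly, matters.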
Okay, awesome. So maybe we can draw an analogy. I think a lot of non-developers are at least tangentially familiar with APIs and how those work in the traditional Web2 world and in Web3. How do RPCs compare, differ, or stay similar?
Is it basically the API of a blockchain? Is that a way to think about it, or not quite accurate? It is. Let's just think about it as an API. There's a slight difference if we go into the actual terminology, but it's very similar, and you can think about it as an API. So basically, RPC is the
API to access data from blockchains. Right. So if I wanted to build an off-chain application, or part of a larger application that's off-chain, but maybe it needs something like the amount of tokens in a certain liquidity pool, just to display as a number somewhere on the frontend. It doesn't
need to interact with the chain, it just needs that data. My app would get that through an RPC endpoint that someone on the chain is running and providing for me. Correct? That's correct. For example, an NFT example: if you wanted to know the price somebody put on their NFT for you to buy,
then you can use an RPC, or an API, to get that data from the blockchain, or the history of NFT transactions, for example. >> Okay, cool. And is this only for on-chain to off-chain data flow? Or, if I'm building an entirely on-
chain app, and within my smart contract I need the price of that NFT, would I still use an RPC, or would I get it somewhere more directly on chain? You would not need an RPC for that. That's a great question. You would not need an RPC. You would use whatever is available on chain to get that data. RPC is a way to
get the data when you're off-chain. Cool, awesome. So I guess, now, I think that's a pretty good fundamental breakdown. I'll let everyone know: if we have time, we will do a little AMA at the end, and also my DMs are open at
the moment. I won't be checking the Tendermint Timmy account, but if you want to shoot DMs to the Spark account or leave them as replies to this Space, I can be sure to jump in with those questions if you want anything more clarified. But cool. So I think one of the interesting parts about the current RPC architecture,
how it's set up, is that in some capacity, and I don't feel qualified to speak on it, they aren't actually decentralized and trustless in the way that other parts of a blockchain stack are. Correct? And so that's sort of what you guys are tackling, in a sense.
Yeah, so think about an RPC server, or what we call a node in Web3. It's by definition a centralized server: you run this node and it gives you data, right? But maybe Ethan would like to elaborate more.
Yeah, exactly. So as Gil was saying just now, you think about blockchains as a network of nodes. The whole idea is that you run your own node in order to get data from that node, or the copy of the blockchain state that that node
contains and stores, and you do that by communicating with the node using RPC. I mean, let me ask you, Timmy: do you run your own node when you use Web3? Well, I personally do
not. I think we do as part of Spark, but I don't know or see that side of things. But as an end user, no. Yeah, well, yeah, exactly. And that's the crucial point that we're trying to address, right? There is no incentive for you to run your own node, especially given how
impractical it can be and how technical it can be for non-technical users of Web3. And so what we've seen is this second evolution of blockchain access infrastructure, or blockchain node infrastructure, where essentially Web2 businesses,
Web2 node providers, have popped up, and they offer node infrastructure to applications. And so now you have companies like Alchemy and Infura, both of which you may have heard of, very large centralized providers in the Ethereum space,
offering RPC to applications across the entirety of that ecosystem, and other ecosystems as a matter of fact. And what you get there is a single point of failure. When you have many, many applications getting blockchain data from centralized
providers, you have an issue, an issue which is: this is not Web3. The issue is that these entities can be censored, they have to be compliant of course, and they can be attacked, hacked. We've
seen honeypot attacks, or DNS hijack attacks, on another centralized provider called Ankr. We've seen Venezuela accidentally censored by Infura. And whilst these centralized providers offer a great service in terms of performance,
and they can be very scalable, and they are powering a lot of Web3, so we can't underestimate that aspect of their offering. Is it accurate to say that in this current model, the only real redundancy or
yeah, sort of backup, that's in place is just the fact that multiple different people run nodes? Like, there's really no layer of security or redundancy beyond that. That if I'm going with, say, Polkachu's node and something happens, I could switch to another provider, like Mintscan, but that's about the extent of the
flexibility you have in the current model, right? Yeah, I think that's definitely one of the major points, on the single-point-of-failure aspect. The other aspect is accountability. When you're using a centralized provider like Alchemy or Infura, they have, you know, SLAs, service level agreements, where they say, okay, we'll give you a certain
level of service. But because they are one of the few options which offer great service, sometimes you have vendor lock-in, and you're forced to use them even though sometimes their service isn't great. And especially if you think about it at the large dApp level, or large application level, where they have to
rely on and trust centralized providers, and any sort of downtime or inaccuracy in the data leads to a huge impact on the users of those dApps. The accountability that you get from these centralized providers isn't great;
at best you get refunds in credits. Yeah, let me touch on one subject I think is very important. So as we've said, centralized providers can basically be a single point of failure. But what's more interesting is to think about them as a pipe that gives you
certain information, right, the data from the blockchain. You basically trust that the data they give you is correct. And when I got into Web3, I honestly didn't believe this is the state of how Web3 works. And I think Monksy spoke about it in one of his blog posts.
How do you know that the data is from the blockchain? How do you know the data is accurate? There's basically no proof, when you communicate with external RPC servers and nodes and centralized providers, that the data is actually true, and that struck me as something that needs to be fixed.
Yeah, it's almost like...
It's almost the oracle problem in reverse, in a way, where getting data on chain in a trustless manner is incredibly hard, and it's similar getting it off, because as soon as you leave the chain itself, trust guarantees are often thrown out the window. Okay, cool, cool.
Yeah, and he touched on a different point: the current public RPCs that are being offered, and I want to talk about that as well. First, I think it's an amazing thing that we have public RPCs and that people are offering them, but there are a few issues, because in the end,
somebody has to incentivize these RPCs to run. So when you think about scale, as more and more projects use these RPCs, they will have to throttle and rate-limit access to those RPCs. So imagine going to a successful app,
and then basically not being able to use it, because whichever public RPC is giving service is limiting the amount of requests it can take. And the reason a public RPC would do that is because they're not getting paid to run these services, which doesn't make sense, right?
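The throttling described here can be pictured with a toy token-bucket rate limiter, the kind of thing a free public endpoint ends up putting in front of its node once traffic outgrows the operator's budget. The rate and burst numbers are arbitrary; this is not any provider's actual code.

```python
class TokenBucket:
    """Toy token-bucket limiter: `rate_per_sec` sustained, `burst` peak."""

    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = float(burst)
        self.last = 0.0

    def allow(self, now):
        # Refill according to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # the caller gets an HTTP 429 and the dApp's UI breaks

bucket = TokenBucket(rate_per_sec=2, burst=3)
# Five requests arriving at the same instant: the burst absorbs three,
# then the endpoint starts dropping.
results = [bucket.allow(now=0.0) for _ in range(5)]
print(results)  # [True, True, True, False, False]
```

A successful app pushing past the free endpoint's budget sees exactly this: the first requests succeed, the rest fail, and users blame the dApp.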
Yeah, yeah, absolutely. And then
It's, tell me if this fits into this conversation, because I'm not actually sure, but I've heard references to full nodes versus archive nodes. And so,
is there also something to be said for, like, as networks grow, the cost of maintaining or providing these RPC nodes also grows? Or not necessarily? >> It depends if you're running a full node. Let's say you're running a full node.
Yeah, that's correct. Well, let me just say that. Yeah, that's completely correct. The more that blockchain adoption grows, the greater the state, right, and the data storage needed to store that state on every node. So that is exactly correct, and
ties in with the centralization problem of nodes. >> Okay, cool. For anyone curious, I wanted to double-check with you guys that that's applicable to what we're talking about. But for anyone curious about archive node versus full node: basically, a full node would be
all of the data since the blockchain's inception, like the entirety of the chain, whereas an archive node does something like only caring about the most recent X number of blocks, and then the rest is compressed or archived in some way.
Not a great explanation, but just for those wondering, since I dropped those terms. Okay, cool. - Let me help you out. So with an archive node, let's say you want to know the state of the balance of your account at block 100, which is, let's say, five years ago.
An archive node would be able to answer that question, and a full node wouldn't be able to answer that question. So it's more expensive, because an archive node needs to keep the state of every block. So I actually had it backwards. Okay. Cool.
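Gil's correction can be sketched with a toy model: a full node prunes historical state and keeps only recent heights (a two-height window here, purely for illustration), while an archive node keeps the state at every height, which is why it's the one that can answer "what was my balance at block 100".

```python
class FullNode:
    PRUNE_KEEP = 2  # keep state for only the 2 most recent heights (toy value)

    def __init__(self):
        self.state_by_height = {}

    def commit(self, height, balances):
        self.state_by_height[height] = dict(balances)
        # Prune anything older than the keep window.
        for h in list(self.state_by_height):
            if h <= height - self.PRUNE_KEEP:
                del self.state_by_height[h]

    def balance_at(self, height, account):
        if height not in self.state_by_height:
            raise LookupError(f"height {height} pruned; ask an archive node")
        return self.state_by_height[height][account]

class ArchiveNode(FullNode):
    def commit(self, height, balances):
        self.state_by_height[height] = dict(balances)  # never prune

full, archive = FullNode(), ArchiveNode()
for h in range(100, 106):
    snapshot = {"alice": h}  # stand-in for the chain state at height h
    full.commit(h, snapshot)
    archive.commit(h, snapshot)

print(archive.balance_at(100, "alice"))  # 100: archive node answers history
# full.balance_at(100, "alice") would raise: that height was pruned
```

Storing state at every height is what makes the archive node so much more expensive to run as the chain grows.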
Okay, so.
But actually, is there anything else you guys want to touch on before we move on? I now want to chat about how you guys are doing it differently, but I want to make sure I'm not getting ahead of myself.
I think we can move to that, unless there are any questions about RPC or anything else we should explain first. Cool, okay. Yeah, I'm sure some might come up as we continue to chat, but let's move into how you guys are rethinking the entire way the system works. I saw you make one post
that said something to the extent of: we're more than just decentralized RPC nodes, that's almost a reductionist way to think about what we're doing. So I just want to give you guys the floor. Give us a breakdown of how Lava Network is changing the way RPCs work in the Cosmos.
I can jump in on that. So the first thing to say is that Cosmos is a particular focus of ours, but it's not everything. We are a chain built with the Cosmos SDK, so eventually we will serve every chain and any chain, and new chains will be added via the DAO. But the reason why Cosmos is a
focus is because the Cosmos ecosystem in particular relies a lot on public RPCs. We've spoken a lot about how centralized providers have their own problems, but Cosmos barely has centralized providers, so there's a tragedy-of-the-commons problem
there, with public RPC, which is obviously what we're looking to solve. Now, touching on Lava Network: why are we different, and, as you say, more than just a decentralized RPC network? Beyond what I mentioned previously, we're also accountable. And so the decentralization
of Lava comes back to what you were talking about in terms of the redundancy. It comes back to the fact that anyone can access the protocol and get RPC data from the Lava network. It comes back to the fact that any provider can join and offer services to the network. But it
doesn't touch upon the accountability. And so, you know, if you think about why other solutions haven't worked in the past, let's say running your own node: when people think about the problem of RPC and centralized RPC, the default answer is typically,
just run your own node. There are two problems with that. One is that you have no incentive to, no financial incentive, and two, you cannot really guarantee the quality of that service just by relying on yourself. What if you could use as many
providers as possible in the most redundant way, but also in an accountable way, where they have to provide you a great quality of service that's fast, reliable and accurate? If not, then there's another provider that you can use on the network. If not, then their rewards are
managed on the network. And so, Lava is decentralized in that we have many providers on the network, so you don't rely on or need to trust any one of them. But additionally, we hold each provider accountable for maintaining a standard of service. So in that way, we banish all the demons
around anything related to "decentralized is not fast, is not performant, is slow", etc. We have a decentralized network which is also super high-quality, performant, reliable and accurate, in an accountable way.
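As a rough illustration of what "holding providers accountable for a standard of service" could look like, here's a toy scoring function combining the dimensions the team discusses in this conversation: latency, availability, and freshness. The formula and the 200ms / 5-block targets are invented for illustration; this is not Lava's actual scoring.

```python
def qos_score(latency_ms, availability, sync_lag_blocks,
              target_latency_ms=200.0, max_lag_blocks=5):
    """Combine latency, availability, and freshness into one 0..1 score."""
    latency_score = min(1.0, target_latency_ms / max(latency_ms, 1.0))
    freshness_score = max(0.0, 1.0 - sync_lag_blocks / max_lag_blocks)
    # Geometric mean: a provider that fails badly on any one dimension
    # cannot make it up on the others.
    return (latency_score * availability * freshness_score) ** (1.0 / 3.0)

# A fast, always-up, fully synced provider scores a perfect 1.0.
print(qos_score(latency_ms=100, availability=1.0, sync_lag_blocks=0))  # 1.0

# A slow, flaky, lagging provider scores much lower and would be paired
# with fewer consumers (and rewarded less).
print(qos_score(latency_ms=800, availability=0.9, sync_lag_blocks=2))
```

The design point is that the score directly drives rewards and pairing, which is the economic lever behind "accountability".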
And, you know, I want to add two things. Imagine this vision that we have in our minds, and this is, you know, what's driving us to build Lava. You're a developer and you spin up
your code. You pay for some sort of service one time, and now your code can access any blockchain with one payment, and you can use not just one centralized provider,
for example Alchemy. You can use Alchemy, Infura, QuickNode, Chainstack, all of them. You get access to all of those amazing providers, and they're great, but now they're also accountable, as Ethan said.
And one more interesting aspect of Lava: I see decentralization come up a lot, and obviously it's extremely important. But it's important to understand as well that decentralization is a tool. It's a tool in our toolbox
that helps us reach that end vision. It's not necessarily the goal. The goal is not just to be decentralized. It's simply one of the tools that helps us build a really, really good, robust, reliable, available network.
Okay, so, tell me if this seems accurate. You've got RPCs in the way that they work now, and it seems like maybe they're inherently going to be centralized. So what you guys are doing is building a layer on
top that brings all of those centralized providers under one roof and adds redundancy, decentralization, and trustlessness through your additional layer. So, actually, to go back, I love learning through analogies and thinking about stuff that way: is it kind of similar
to how Akash or Chainlink works? To go back to the oracle thing, where there are the centralized things, but then the actual application layer, Chainlink or Akash, works to provide those centralized services in a more decentralized, trustless,
and redundant way. Is that kind of an accurate analogy? Yeah, I think it's a great analogy, right? An oracle strives to, let's say, take data from the outside world, like the price of a pair of coins, for example,
and that data is impossible to get just from the blockchain, because you have centralized exchanges that trade those pairs. So there's a consensus of oracle node runners that communicate in an algorithm,
and then they send a transaction to the chain with the final result. All these reports come on chain, and you can get the price data from the price feed from outside. And Lava is very similar, only we get the data from the blockchain, in an accurate way, for the end user.
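A toy illustration of the oracle-style aggregation just described: many node runners each report a value, and the aggregate (a plain median here, purely as an illustration, not Chainlink's actual algorithm) is what lands on-chain.

```python
# Many oracle node runners each report a price; aggregating them means
# a single bad reporter can't move the final on-chain result.
reports = [1.52, 1.50, 1.51, 9.99, 1.49]  # one outlier among five reporters

def median(values):
    """Middle value of the sorted reports (mean of the two middles if even)."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    return ordered[mid] if n % 2 else (ordered[mid - 1] + ordered[mid]) / 2

print(median(reports))  # 1.51 -- the 9.99 outlier is ignored
```

The contrast with Lava is exactly the point Timmy raises next: an oracle derives a new number from the inputs, whereas an RPC network relays what a chain already says and checks that the relay was honest.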
Yeah, that makes total sense. It makes so much sense, in fact, that I think my next question is: how has this not been tackled already? Or maybe it has, and it led to failure. If so, if you guys know of those sorts of things, is anyone else tackling this?
Has this been explored before for RPCs, as far as you guys are aware? It's been explored by many, many very smart people, actually, and I've read many research pieces about it. It also connects with light clients a lot,
even though light clients solve a different problem. So there was, and is, a lot of exploration in the field; however, it has not been executed according to the vision that we see.
Okay. So do you anticipate, though, that...
Actually, is there a future where something like what you guys are building, maybe even your exact tech, gets implemented as
a default, through being part of the Cosmos SDK or something like that? Because now that we're talking through it, it kind of seems like a no-brainer. So my head is at, like,
why would we not all move to this sort of model? And if we do that, why not make it a more core part of the SDK? But also, mind you, I'm not a dev, so maybe that's a stupid thought and I'm overlooking something. I don't think it's stupid at all. I think it's part of our--
I think it's great. Yeah, and we do aspire to get there, and we are building an open source protocol. And we started in Cosmos, so we feel like it could easily integrate into any Cosmos chain.
And Lava will actually support many, many Cosmos chains; we're in the process of adding more. So I think as it develops and becomes more mature and robust, for sure we're looking to integrate it into the Cosmos SDK, if possible.
Yeah, okay, awesome. So maybe let's dive a little bit more into some of the nitty-gritty. We'll definitely try to keep it normie-centric, so people like myself in the audience can understand. But within this system,
I, as an end user who might be building something where I need an RPC endpoint, will obviously benefit from what Lava is doing: I can go to Lava and kind of have a one-stop shop that has that redundancy built in. But if I'm an RPC
provider who's already running one, do I reap any benefit from what you guys do? I'm almost wondering if part of the roadmap might be some form of monetary incentive, or inverse monetary incentive, like a locked-up vested thing that could be slashed, or
something that, on top of the system you already have, further incentivizes RPC providers to be honest, or something like that. Yeah, so I guess that'll be my question: as an RPC provider on the other side of the equation, what benefits would I reap from Lava?
You're asking all the right questions. So yeah, most of what we spoke about today has focused on the developer experience and all the benefits there. But as we've kind of touched on as well, a lot of RPC providers out there, including chains that run public RPC, do so for free.
There is no built-in incentive, unlike, say, staking or running a validator, where there's a built-in incentive on-chain for that. So what would happen is providers would join our network, the decentralized network of providers, and the network would serve essentially as an open market, an open market for RPC, where
developers building applications would pay subscriptions in exchange for RPC from the RPC providers. They would be paid for the first time, pretty much, given that previously they were running free endpoints. So not only are they paid, they are paid, as you say, based on the quality of service.
If you have great service, then you're rewarded more. If you have maybe poorer service, then you're rewarded less, and also paired with fewer developers and applications. How do you guys grade that quality of service? I assume it's an algorithm, it's not like you or the community hand-
picking, right? Yeah, so there are many ways that we do quality of service. We have a quality of service score, where clients will actually score providers based on the quality of their service. That score has three dimensions: the latency, i.e. the speed of response; the availability
of the node: does it go down often, does it always respond; and then finally, the accuracy, the data integrity: is it fresh, is it synced to the latest data on the blockchain? So this is what I'm referring to when I talk about accountability. Because when you use a centralized provider like Alchemy or Infura,
they can be giving you data not from the blockchain. As Gil said, how do you know it's originating from the blockchain? How do you know it's the latest state of the blockchain? You don't. So, you know, you will see on Twitter, whenever there's a crazy NFT mint, someone will buy an NFT, and the transaction seems to go through,
but then they won't get the NFT. >> Well, actually, let me slightly rephrase my question. So, that exact problem that you just illustrated, of not knowing as a builder whether that data is trustworthy: I guess I'm wondering how your protocol does know, if it's unknown.
And I assume it would be something kind of like how Chainlink detects bad providers, where they'll look at the whole host of data coming in, figure out agreed-upon averages, and for those that deviate too much, knock down their scores. Is it something like that?
Not 100% the same, and we may change this mechanism, but it sounds similar. So what we do is probabilistically and randomly sample responses from a provider, and then we compare that to another provider. If there's a conflict, we'll bring it on chain for a vote, essentially, among all providers, that would then determine what
is the correct answer, by consensus. So this is our conflict detection and resolution mechanism. Right now it's honest majority, but we are planning to use a jury and also light clients to reduce the trust assumptions even further. Maybe Gil wants to expand on that, actually.
Yeah, of course, I would love to. And you answered it pretty accurately, to be honest. It's two aspects, two big features that we are working on. And by the way, we've been thinking about this problem since day one. The whole design and goal of Lava was to
build a way for us to have trustless RPC for ourselves as developers. So, quality of service is one feature, and data reliability is another feature. The quality of service, as Ethan mentioned, is a way to determine how good a provider is at giving you
data fast, how fresh the data is, and making sure that they're available. And data reliability is conflict detection and conflict resolution. So detection is: how do you know that the data is incorrect? The way we know that it's incorrect
is by sampling responses, off-chain. So we ask one provider a question: "What's my account balance at block 100?" And they reply: "It's 1 ETH." And then you ask another provider: "What's my balance at block 100?" And they reply: "It's 2 ETH." Now I
know I have a conflict. So I take this conflict, and I have two hashes that are different for the same request. So I can prove there's a conflict. I take it, put it on the chain, and then the chain can resolve it with conflict resolution, which is done by on-chain voting.
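A minimal sketch of the conflict-detection idea Gil walks through: hash each provider's response together with the exact request it answers, so two answers to the same question can be compared and a mismatch proven. Field names and values here are made up; this is not Lava's actual wire format.

```python
import hashlib
import json

def response_hash(request, response):
    """Hash a provider's response bound to the exact request it answers."""
    blob = json.dumps({"req": request, "resp": response}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

# Hypothetical query: account balance at block 100.
request = {"method": "balance", "account": "example-account", "height": 100}

h_a = response_hash(request, {"balance": "1 ETH"})  # provider A's answer
h_b = response_hash(request, {"balance": "2 ETH"})  # provider B's answer

# Two different hashes for the same request prove a conflict, which can
# then be submitted on-chain for resolution by vote.
print(h_a != h_b)  # True
```

In the real protocol the responses would also be signed by the providers, so neither can later deny having given its answer.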
Okay, okay. Cool, so...
So actually, let me use the same two analogies again for this next question. So, whereas something like Chainlink might have different numbers
coming in that are all accurate, because it might be pulling from different exchanges, let's say the price of a certain coin, and so it does some averaging magic in there...
So with the Chainlink model, the number that you would be getting from using their service is actually something that Chainlink has derived; it's a number they've come up with from those other sources. Whereas with Akash, if you go and use Akash, you're not really using
Akash's compute or anything; Akash just connected you to one individual who is offering compute that was a match for what you wanted. It sounds like you guys are more on the Akash side of things, in
that the data I would pull from using Lava Network is not a number that you guys are deriving from all the inputs, but rather from an individual node that you guys have assigned to my request, until it no longer is
providing accurate data. Is that correct? I think that's a great point, Timmy. And also, to add context to what Gil and Ethan answered before: the communication between consumers, which are applications, and node providers,
RPC providers, happens off-chain. So that's exactly like you said. What Lava does, what the protocol on-chain does, is the pairing: it gives you a list of relevant providers for
the chain and the geolocation that you're looking for. And it also does the settlement for node providers when they need to get their rewards: they upload it on chain. And the same goes for
what Ethan explained about conflict resolution. And yeah.
Yeah, so it's peer-to-peer. The communication is peer-to-peer between consumer and provider. Yeah, because we need speed, right? Yeah, imagine how slow it would
be if you had to go through the blockchain for every query. And we're talking about hundreds of millions, if not billions, of queries, so it has to be really fast.
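A toy sketch of the on-chain/off-chain split just described: the chain's only job in the hot path is pairing, handing the consumer a list of matching providers, and all the RPC traffic then flows peer-to-peer. The registry fields, names, and selection logic are all invented for illustration, not Lava's actual pairing.

```python
import random

# Hypothetical provider registry; fields and names are made up.
PROVIDERS = [
    {"addr": "provider-a", "chain": "cosmoshub", "geo": "EU"},
    {"addr": "provider-b", "chain": "cosmoshub", "geo": "EU"},
    {"addr": "provider-c", "chain": "cosmoshub", "geo": "US"},
    {"addr": "provider-d", "chain": "osmosis", "geo": "EU"},
]

def pair(chain, geo, k=2, seed=0):
    """On-chain step: deterministically pick up to k matching providers."""
    matches = [p for p in PROVIDERS if p["chain"] == chain and p["geo"] == geo]
    rng = random.Random(seed)  # a real chain would derive entropy from block data
    return rng.sample(matches, min(k, len(matches)))

session = pair("cosmoshub", "EU")
# Off-chain step: the consumer now talks directly to these providers,
# peer-to-peer, at full network speed; only reward settlement and
# conflict resolution go back on-chain.
print([p["addr"] for p in session])
```

Keeping per-query traffic off-chain is what lets a decentralized network serve billions of requests without paying a consensus round-trip on each one.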
Okay, cool. Yeah, that makes total sense.
Sweet. So, I guess, actually, I should have done this earlier. This is something I want to start doing with all of these Spaces, especially when we're chatting with more tech-oriented projects. And feel free to jump in here, guys, because I can already tell I'll probably flounder by the end of this a little bit. But
I want to quickly touch on why people in the audience should care, because a lot of people in here are not necessarily devs or node providers themselves. They're used to hopping into Spaces where we're chatting with an NFT collection or some new DEX with an investable token. But I think
this should be just as exciting. So I guess what I'll say is the age-old saying: a system is only as strong as its weakest link. We have all this awesome architecture in Cosmos that is decentralized and trustless, and
as you guys pointed out earlier, Cosmos, more so than other ecosystems, really relies on and uses RPC nodes a lot. And so with that part of the system currently not having much oversight, redundancy, or trustlessness... it's not something that
the average user will see a direct benefit from, like if lava network gains wide adoption, but it does matter a ton like in a secondary capacity. So a lot of the times when you're using a D app and something's going funky with the interface or with prices loading, a lot of the time that could be because
of RPCs. So one effect is like you'll by supporting lava and seeing lava gain traction, you'll probably have just a better UX experience on apps across the cosmos. Then there's a whole other angle of the like security and even though we touched on our
earlier that like if I'm building something like a lending protocol that's on chain, I won't be getting my data from RPCs necessarily. It can get it on chain. But that doesn't mean that there aren't like a near infinite number of other avenues of attacks that someone like myself wouldn't think of. If you're not a hacker or a pen tester,
So, you know, for all we know, it's something like lava network not taking over will affect us all directly through some sort of hacker exploit down the road or some sort of scam or whatever it might be. So, yeah, feel free to jump in here at any point guys.
But why should the average, sort of, normie user be excited about overhauling how Cosmos uses RPCs? Because it's not super glamorous, I'll admit, but I do think it's crucial as a core part of the system. We think it's glamorous. We're trying to make it glamorous again, you know?
We're trying to make infrastructure glamorous. It's tough at times. We're doing our best. As you've said, and I think you touched on really the main points, right? User experience, number one. You'll have better prices faster. You'll see what's happening
on the chain faster and more reliably. For example, when an RPC provider goes down. This happened on Ethereum: if I remember correctly, when Infura was down for a few hours, retail, or anyone using MetaMask and many other wallets, was simply not able to transact. They weren't able to swap. They weren't able to withdraw their funds. They weren't able to buy anything. They weren't even able to see their balance, because it just wasn't working. So yeah, I think that really sucks. As adoption grows, especially in the next cycle, we want to make sure user experience is tight and that you can always get an answer about how much money you have, how many tokens you have, what's going on with the prices.
It's extremely important. The second thing you touched on: security. We sometimes look at security as a secondary thing in importance, but actually, when it comes down to it, security is really, really important, because if something is not secure, we don't have trust in it anymore. If we don't believe the data we see, if one time we get fake data and we've made a financial decision based on that fake data, no matter how small or big, that means we can't trust our system. This is why security is, in my opinion, of top importance for users and for everybody in the ecosystem. So I think these are the two greatest things.
Ethan or Yvonne, anything to add? Yeah, I want to come at it from another perspective as well. I think Gil touched upon the practical benefits, the user experience, the security. As a community manager, I'm thinking about the community as well, and
what I want to say is that if you believe in Web3, you should believe in Lava, right? Lava makes Web3 possible. Right now you use centralized providers, and the data that you get, the data from this supposedly magic database, the blockchain, where you don't need to trust anyone, the data that you get from a centralized provider could be anything. It could be an illusion. You will never, ever get the full benefit of blockchains, of Web3, if you're not using a decentralized gateway to blockchain data, and that gateway is Lava. And the other thing I wanted to say is that the vision for the community moving forward is: how can Lava be that gateway in more ways than
just being an API environment, an RPC environment. You might have already seen on Twitter that we're supporting communities as well, smaller communities, smaller apps and applications, and we want to become the place where people also come to learn about new projects. And those projects, hopefully, are users of Lava as well, as the gateway to blockchain data. So that's another perspective: if you believe in Web3, Lava should also be top of your agenda, not just for a great user experience, but also for discovering all that Web3 has to offer, through the gateway, through Lava. I love it. I'll have to chat with the validator side of my team, but Airdrops is in the audience
right now, so the man in charge is actually listening. I think Spark would most certainly be interested in spinning up RPCs once Lava is ready to go, and participating. I love what you guys are doing. And yeah, on the security side of things, one thing I'll just
say is: some of these Web3 exploits and hacks, or even just scams that we see, I'll read about how they were done and just think, I never would have thought of that. That is such a weird, you know, exploit in the system or way around it. Yeah, that's how we think about RPCs too. I think some people might think of RPCs as not as core a part of the security model of Cosmos chains, but I think it's really dangerous to discount anything, especially something as frequently used as RPCs.
Cool, so I guess the next, and one of the last, questions I have is not for an RPC provider and not for someone building an app that needs RPC nodes. We already touched on how the user will sort of indirectly benefit from Lava Network through it improving the Cosmos. But how could the average user directly interact with, participate in, or benefit from Lava? Like, are you guys going to have a token? If so, what will it control? Will there be ways for... yeah, open-ended question, I guess. How will an average user be able to participate in Lava Network in some capacity?
Yeah, we're thinking of ways to let the end user access the Lava network directly. These are still works in progress. One of the ideas we have, for example, is to have a browser extension, or forks of popular wallets, that use the Lava network directly, so they get the full value of the network. This is something we're still figuring out. One thing that the consumers, the actual end users of the RPC, can do once we're more established and in production on mainnet: I think they can demand that their wallets introduce support, and they can request that the dApps they're using use Lava, so they can get the latest, freshest, most verified data out there.
Okay, cool. So then I guess I'll ask two questions rolled into one here. So, no token, it sounds like?
No, we're working on a token. It's a Cosmos chain, right, Lava? It's an app chain built on Cosmos, which will most likely have a native token. And that token will be used for the operation of providers and consumers, and validators, of course. But we're just not ready to share too many details on that, as it's in the works. Okay, cool. Yep, you answered the second part of my question there. Yeah, I was curious if this was going to be sort of a floating protocol that lives wherever, or if it will have its own chain as a home. So that's really awesome to hear.
Cool. I'll have to make sure, I forget because we've gone through hundreds, but have you guys submitted any info or chatted with us about your page on Interchain Info? We definitely want to make sure that's filled with some good information about you guys and looking pretty. We can work on that after this call if you want. Yeah, for sure. Yeah, I don't think we have that. Well, I'm sure you guys have a page already, but I'm sure it's what Wikipedia would call a stub of an article. So we'll work on expanding that together.
Awesome. I guess I should have mentioned this a couple minutes ago, but if anyone does have questions, definitely feel free to start requesting. We're nearing the top of the hour. Do you guys have to run at the top of the hour? I have to run at the top of the hour. Okay, that works out for me as well. There's another space I'm going to try and pop into. Yeah, if anyone has any kind of final thoughts or closing questions, feel free to come up and request now. And while we wait
for that, I guess I'll just give you guys an open floor. Is there anything we didn't touch on that you'd like to? Anything in particular that's maybe upcoming, whether in the near future or much later down the road, that you guys are particularly excited about or want to share?
Yeah, there's one thing that we haven't talked about, a surprise, which is privacy. Oh yeah, love it. We can spend these last 10 minutes on that, actually, for sure. Yeah, I mean, Lava is many things: fast, reliable, accurate, decentralized, but also private. And you see this as a constant problem in the space, right? Especially if you're using centralized providers. Infura recently announced a change in their policy, which was updated to include, I think, capturing IP addresses. So any provider can basically
connect an IP address to the queries that are made. The way Lava solves that is we randomly pair you to a list of providers, and then, you can think of it this way, we basically scramble your queries, your RPC calls, between those providers, so no one provider can profile an individual account, an individual application, or user. And so with that, you get increased privacy. Down the line we'll have another feature, a privacy mixer, which will entail total privacy. And maybe Gil wants to talk a bit more about that.
Sure thing. Yeah. So privacy is really important in Web3, right? We're trying to make sure no one can track and get a full picture of the user. And the reason behind that is that if someone has access and can see the type of data that you're looking at, they're better positioned to sell you stuff or try to target you to do things. So it's very important for us. Ethan was mentioning a feature that we're working on, which is a privacy mixer. A privacy mixer would basically allow you to use the same facilities of Lava, but use them from different accounts, from different source wallets, so that the providers that are serving your data to you don't know which account belongs to which user. So you could basically make, whatever, like incognito mode. Think about incognito mode for RPC requests, I guess; that would be the best analogy.
Okay, cool. Yeah, how did we not even touch on this? I guess that's something really important to keep an eye out for in everything we build, because blockchains aren't very private. One of their downsides is that you can see everything. So taking steps to add any amount of potential privacy at the various steps along the way of using or building a dApp, etc., I think is really, really great, and it just adds that level. So would this be opt-in, opt-out, or just default for everything? So the first feature that Ethan mentioned, that's default. That's just the way Lava works: it scatters your queries. You don't have to opt in. By default you get more privacy than you would have gotten by working with a single provider. The privacy mixer, that's an opt-in feature. It's currently not implemented; it's in the design phase, but we're hoping to bring it some-- Okay, cool.
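The default scattering behavior just described can be sketched in a few lines. This is a toy illustration only; the provider names and the uniform random choice are assumptions, not Lava's actual pairing algorithm:

```python
import random
from collections import defaultdict

def scatter_queries(queries, providers, rng):
    """Spread queries across a randomly paired set of providers so that
    no single provider observes the full query stream."""
    assignment = defaultdict(list)
    for query in queries:
        # Each query goes to a randomly chosen provider from the pairing.
        assignment[rng.choice(providers)].append(query)
    return dict(assignment)

providers = ["provider-a", "provider-b", "provider-c"]
queries = [f"get_balance(account-{i})" for i in range(30)]
spread = scatter_queries(queries, providers, random.Random(42))
for provider, qs in spread.items():
    print(provider, "saw", len(qs), "of", len(queries), "queries")
```

Because each provider only ever sees a random slice of the stream, none of them can reconstruct the user's full query history on its own.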
All right, and just because we do still have eight minutes, and I'm just sort of thinking through... Mainly right now my head is kind of turning around the different types of RPC providers, whether they're small private ones or businesses doing it. Like, when might someone opt out of the mixer? Is there any use case or situation you guys have thought through where it's like, oh, we want it to be opt-in because someone like X might not want to use it?
It's a good question. I think that
Most users would probably want to use it. Yeah. Though maybe it would add a bit of a burden, because the mixer requires you to create new addresses and then use those addresses to make queries. Honestly, that can probably be built into our code on the front end, so you don't have to do it manually. Oh, wait, I actually might be slightly misunderstanding. So who would be opting into it? The builder that needs an RPC node, or the provider that's running the node? The builder. The builder. Okay. Gotcha. Gotcha.
So it's from the... yeah, so it's a way for you to make a query. Think about a query like going to Facebook and searching something there. I'm sure nobody wants other people to know their, right, Facebook, or what is it, Instagram searches. This is basically what RPC is, right? You're making these searches: what NFTs are you looking at, what tokens are you looking to buy, what prices, etc. You don't want people to know that. Not just people, entities, because they can take advantage of that. So by using privacy mixers, you can make these queries from an address that is not associated with your user, so it's much, much harder to know that you're making these queries, and that's what makes it unique. Okay, cool. Thinking about it more, I don't have anything specifically, but I do feel like optional is the right route there. I can potentially see some projects that, for various whatever reasons, might want that to not be fully mixed. But okay, super cool. Well, I think I'm going to double-ask here, just because we almost forgot to touch on that, which is awesome. Any other
little points we want to touch on that maybe we missed, or things you guys wanted to mention? Yeah, the last thing is: if what you're hearing sounds like something you'd use, we are currently open in terms of the testnet. We haven't announced it on Twitter yet because we wanted to keep it for a moment like this call. But since you're on this space, and you've been listening to us for the last hour, do come to our Discord. You can use Lava now via a sort of beta offering, the Lava Gateway. And you can be part of our feedback: you can spot bugs for us, you can hopefully participate in testnet tournaments, jump on a call with a DevRel engineer who was just hired, who is amazing, who would love to speak to you guys. And yeah, again, come to the Discord, try Lava. We'd love to have you contribute towards the decentralization of blockchain data access.
Awesome. Love it. Yeah, everybody give Lava Network a follow. Thank you so much for joining me, all three of you, for taking this time out. I think we made very efficient use of it. I feel like I have a better understanding of how RPCs fit into everything, and I come away with a bit of a burning passion to see you guys succeed, because I think we need to improve them. So, awesome. Real quick, speaking as Tendermint Timmy here and not the Spark official account: as some of you might know, I recently joined the Umee team, which, by the way, we should definitely chat about making sure Umee and Lava are working together. But I'm on the Timmy account, going to be popping over to a space that they're hosting now with Chengo, so if anybody is still hungry for more spaces, we're going to spin this one down, but you could pop over there and join me. But thank you guys from Lava so much for joining today. This has been awesome. We should definitely do another one of these a couple months down the road, or maybe when you have a big update or mainnet launch. Yeah, sounds amazing. Thank you so much for hosting us. Thank you very much.
Thank you very much. You're positioned at Umee, by the way; we love that chain, and we would definitely want to work with you guys. Cool. Cool. We'll make it happen. Or I'll do my best, I should say. All right. Everybody who came as a listener, thanks so much. Your support is great. TerraSpaces, appreciate you jumping in on short notice. We did record it, so you can splice the beginning off, however you do it. And yeah, everybody have a great day. See some of you in the next space or next week. Thank you, everyone. Thanks, everybody. Bye-bye.

FAQ on Lava-naut Bootcamp 🌋 | Rethinking RPCs in the Cosmos | Twitter Space Recording

What is the purpose of Lava Network?
Lava Network aims to improve the RPCs in the Cosmos ecosystem.
What is the role of RPC nodes in the blockchain?
RPC nodes synchronize with the network and verify the consensus to provide data for off-chain applications.
How can RPC nodes be compared to APIs?
RPC nodes can be thought of as APIs for blockchains. They are used to access data from the blockchain.
What kind of data can be obtained through RPC nodes?
Data related to the blockchain's transactions, such as the price of an NFT or the amount of tokens in a liquidity pool, can be obtained through RPC nodes.
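To make that concrete, here is a sketch of consuming such data. The response shape loosely mimics a Cosmos SDK bank balance query, but the field names and values are illustrative, not a real node response:

```python
import json

# A sample (illustrative) response an RPC node might return for a
# Cosmos-style bank balance query; simplified field names.
raw_response = json.dumps({
    "balances": [
        {"denom": "uatom", "amount": "2500000"},
        {"denom": "uosmo", "amount": "100000"},
    ]
})

def total_for_denom(response: str, denom: str) -> int:
    """Extract one token balance (in base units) from an RPC response."""
    balances = json.loads(response)["balances"]
    return sum(int(b["amount"]) for b in balances if b["denom"] == denom)

print(total_for_denom(raw_response, "uatom"))  # 2500000 base units, i.e. 2.5 ATOM
```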
Can RPC nodes be used for on-chain applications?
RPC nodes are used primarily for off-chain data flow. On-chain applications would access data through the blockchain itself.
What is the background of Lava Network's CEO?
Lava Network's CEO has a background in cybersecurity and has been working in the blockchain space for about 2 years.
What is the experience level of the Lava Network team?
The Lava Network team is relatively new, with most members having worked on the project for about 8 months or less.
What is the main focus of Lava Network's marketing and community efforts?
Lava Network's marketing and community efforts are aimed at building awareness and engagement for the project within the larger blockchain community.
What is the potential benefit of Lava Network's improvements to RPCs?
By improving RPCs, Lava Network could make off-chain applications on the Cosmos ecosystem faster, more reliable, and more accessible.
Is there an AMA planned for the end of the podcast?
Yes, there is a planned AMA for the end of the podcast. Listeners can also DM questions to the Spark account during the podcast.
What is the purpose of the podcast recording?
The purpose of the podcast recording is to chat with Lava Network and discuss their approach to the Cosmos ecosystem.
What does Lava Network focus on?
Lava Network focuses on improving RPCs within the Cosmos ecosystem.
Who are the speakers in the podcast recording?
The speakers in the podcast recording are Ethan, Yvonne, and Gil from Lava Network.
What is Ethan's role at Lava Network?
Ethan leads marketing and community at Lava Network.
What is Yvonne's role at Lava Network?
Yvonne does marketing and community at Lava Network.
What is Gil's role at Lava Network?
Gil is the co-founder and CEO of Lava Network.
What are RPC nodes?
RPC nodes are used to read and verify data from a blockchain by syncing with the network.
What is the standard way of accessing data from blockchains?
The standard way of accessing data from blockchains is through RPC.
What is an off-chain application?
An off-chain application is a part of a larger application that does not interact with the blockchain, but needs data from it.
What is the difference between on-chain and off-chain data flow?
On-chain data flow occurs within a smart contract on the blockchain, while off-chain data flow occurs when an off-chain application needs data from the blockchain.
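A minimal sketch of that distinction, using a toy in-memory state in place of a real chain (all names and the pool structure are illustrative):

```python
# Off-chain data flow: an application outside the chain asks an RPC
# node for state it cannot read itself.
def off_chain_price_lookup(rpc_query, pool_id):
    # rpc_query is a stand-in for an HTTP/JSON-RPC call to a node.
    pool = rpc_query("pool_state", {"id": pool_id})
    return pool["token_out"] / pool["token_in"]

# On-chain data flow: a contract reads the same state directly from
# the chain's own storage during execution -- no RPC hop, and the
# result is part of consensus.
def on_chain_price_lookup(storage, pool_id):
    pool = storage[pool_id]
    return pool["token_out"] / pool["token_in"]

# A toy in-memory "chain state" standing in for both paths.
state = {"pool-1": {"token_in": 100.0, "token_out": 250.0}}
fake_rpc = lambda method, params: state[params["id"]]
print(off_chain_price_lookup(fake_rpc, "pool-1"))  # 2.5
print(on_chain_price_lookup(state, "pool-1"))      # 2.5
```

Both paths read the same underlying state; the difference is whether the reader sits inside consensus (a contract) or outside it (an app going through an RPC node).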