Okay, welcome into another awesome spaces that we're hosting today with two great teams that
are building on Avalanche and excited to talk to them about all the great things that they're
bringing to the ecosystem.
So we're just getting the spaces opened up, getting things started, welcome to those
that are coming in right on time.
And we're going to get started here in a minute or two, give some time for our speakers to
join in and some more people to trickle in the room.
And then we'll get started talking with Pyth Network and Movement Labs about all the awesome
things that they're bringing to the Avalanche ecosystem, and there's a lot to dive into here.
So thanks for coming through, I see some familiar faces already, and we're excited to have this one.
So we're back on another spaces with two great partners that we have.
So hang tight, we're going to wait for our guest speakers to come in and some of the
Avalanche team and then we'll get started here with our conversation with Pyth and Movement.
So appreciate everybody coming through, hope people are having a good Thursday so far and
a good start to the year.
All right, let's see who we got here in the room that's coming through.
Welcome, welcome, welcome.
I see things are starting to fill up.
Let me get Cooper up here.
Cooper, how are you doing?
GM, GM, doing really well.
Excited for the convo today, thanks for coming through.
Yeah, thank you for having me, big things to come.
Just excited to hear how things have been progressing and I know you and the team are moving quickly
so I'm sure there's even lots of new information to share and to hear about.
Yeah, I'm at Big Sky right now, so just having a good, easy morning drinking some coffee, and
I'm going to go skiing after this Twitter spaces.
Well hopefully we can make this hour worth it for you so that you can head back out there.
But yeah, we're getting the rest of the speakers up here.
Mark, let's test your audio.
GM, Matt, thanks a lot for having me.
Yeah, excited to have you.
Thanks for taking the time and coming and joining us.
And we're pulling up Eric here as our last speaker here.
Let me just make sure this goes through.
All right, Eric, what's up, man?
Thanks for coming through and helping to host this so looks like we got our four speakers
up here and I know there's other folks that are just joining in.
So we're kicking off today's spaces, just talking about building on Avalanche,
right, bringing really critical tools, infrastructure that not only helps the
Avalanche ecosystem, but I'm sure is going to spread far beyond what's happening
here. So we're going to kick things off here.
Let's do a quick round of intros and then Eric and Matt, I'll let you take it
away from there, but wanted to first kind of introduce the two guests that we
have. And so Cooper, why don't you give us a quick little rundown on yourself?
Yeah, pleasure to meet everyone.
So my co-founder and I, we dropped out of college to build in the Move ecosystem.
Before that, I was working on a venture-backed project.
I came across Move because Solidity was a big pain point for creating these really asset-oriented, ownership-oriented structures.
But with Move, you have faster performance, better security, and easier development.
And Move was originally built by Facebook, then came to Web3.
But even then, it was still a little bit centralized.
And so we founded Movement to democratize the Move language, decentralize it, and bring it to market in a Web3-native way.
Awesome. Yeah, thanks for the background there.
I know you guys have been working on a lot of really cool stuff that we can dive into.
And then Mark, why don't you give us a quick intro on you and the Pyth Network?
Yeah, of course. So I'm Mark, I work for the Pyth Data Association.
Day to day, I mostly focus on the DeFi footprint of the Pyth network,
so making sure that the amazing people building DeFi can integrate with the Pyth oracle.
And so the Pyth Network is a blockchain oracle specialized in financial market data.
So today we have more than 400 price feeds updating at sub-second latency, and those are available on close to 50 blockchains.
And of these 50 blockchains, around 40 of them are EVM, whether it's layer ones, alt layer ones, layer twos, even some app chains.
Not yet a subnet, but who knows? I mean, with Movement, it's going to happen pretty soon, I guess.
And very much the idea of Pyth, if you had to kind of explain it to your parents: oracles existed before Pyth.
We have Chainlink, especially from 2017, 2018.
But we recognized at least two areas where we could improve, and one is how to distribute data.
Because beyond the oracle itself, financial data is a very big business, with billions of dollars in the, let's say, Web2 world.
And so the idea was to kind of create a Spotify for financial data: attract data owners, data creators to contribute it, the oracle aggregates it, and then it's available for builders to build on top.
So very much that's the whole, let's say, Pyth idea.
Go find data you wouldn't find elsewhere, aggregate it, and make it available to builders.
That's amazing. Yeah, it's really exciting to see what you and the team are doing.
So excited to dive in more.
And then from the Ava Labs team, give us a quick intro, and then Eric and Matt,
I'll let you guys take it away from here.
So myself and Matt, we're both part of the Ava Labs BD team.
For those maybe new in the audience, Ava Labs is the primary development company, or service provider, behind the Avalanche blockchain.
On the BD side, we work with partners across a variety of verticals.
Matt and I specifically sit on the DeFi side of things.
But we also work with different infra partners, such as the Movement team here.
And then we have vertical teams tailored to gaming, NFTs, enterprises, institutions, you name it, and we try to cover everything here.
As for me personally, I've been here a bit over three years now.
Started off in September 2020, right when the network launched.
Before this, I was on a pretty traditional finance consulting career track before I went down the crypto rabbit hole.
Matt, if you want to say a few words as well.
Yeah, so my name is Matt.
Been here at Ava Labs for just about two years.
Like Eric mentioned, I originally joined to focus primarily on DeFi, but our team kind of does it all within the ecosystem.
And so I've helped out quite a bit with institutional, data and infra tooling, and security projects as well.
And it's funny, we've kind of been incubating and helping the Movement team since the very beginning of when they started.
I remember meeting them in person last year in Denver, and back in November of 2022 for the first time.
And now here they are with tons of hype and an almost-live testnet, as well as many builders and users very excited to use their subnet and rollup.
So yeah, I guess popping right into questions about building on Avalanche:
would you mind giving us an overview of Pyth, Mark,
maybe the architecture behind it, what exactly Pyth is, that sort of thing?
So I think the first thing for people to know is Pyth doesn't work like other oracles.
Most people know Chainlink, and they've been working kind of the same way for the past few years.
The idea is that they push prices at regular intervals, depending on parameters: every X minutes, every X seconds, every X percent price deviation.
And so the first thing we recognized at Pyth was that you need to bring more customization to developers.
So we built Pyth as an on-demand oracle, or pull model.
So the Pyth oracle constantly creates prices on its own blockchain, called Pythnet.
It does like two or three blocks per second, so it's pretty much as fast as the fastest blockchains.
And all these prices are exposed to anyone, for free, off-chain.
At any point, anyone can fetch these off-chain prices and their associated proof and send those to the Pyth smart contract on Avalanche, or any other chain, to make those prices available there.
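To make that pull flow concrete, here is a minimal TypeScript sketch of the pattern Mark describes: fetch the latest signed price update off-chain for free, then pay the small fee to post it to the Pyth contract on the destination chain. The Hermes endpoint, the ABI fragment, and the placeholder addresses are assumptions for illustration, not an exact copy of Pyth's SDK.

```typescript
// Sketch only: pull a Pyth price update off-chain and post it on-chain yourself.
// Assumed: the off-chain price service URL/response shape, the IPyth-style ABI
// fragment, and the contract/feed ids below are illustrative placeholders.
import { ethers } from "ethers";

const HERMES = "https://hermes.pyth.network";                        // assumed off-chain price service
const PYTH_CONTRACT = "0x0000000000000000000000000000000000000000";  // placeholder: Pyth contract on Avalanche
const ETH_USD_FEED_ID = "0x...";                                     // placeholder: ETH/USD price feed id

// Minimal ABI fragment modeled on Pyth's public IPyth interface (treat as an assumption).
const PYTH_ABI = [
  "function getUpdateFee(bytes[] updateData) view returns (uint256)",
  "function updatePriceFeeds(bytes[] updateData) payable",
];

async function pushLatestPrice(signer: ethers.Signer): Promise<void> {
  // 1. Fetch the latest signed price update (price + proof) off-chain, for free.
  const res = await fetch(`${HERMES}/v2/updates/price/latest?ids[]=${ETH_USD_FEED_ID}`);
  const body = await res.json();
  const updateData: string[] = body.binary.data.map((hex: string) => "0x" + hex);

  // 2. Pay the (tiny) update fee and post the update to the Pyth contract on-chain.
  const pyth = new ethers.Contract(PYTH_CONTRACT, PYTH_ABI, signer);
  const fee: bigint = await pyth.getUpdateFee(updateData);
  const tx = await pyth.updatePriceFeeds(updateData, { value: fee });
  await tx.wait();
  // Any contract on the chain can now read this fresh price from the Pyth contract.
}
```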
So this whole design especially enabled perpetual protocols to, one, have access to the 400 price feeds the Pyth network reports on its own blockchain, and also to get price updates on-chain whenever they need them.
So if we take the example of a push oracle for perps: let's imagine you have an ETH/USD price feed that updates on-chain automatically every 0.5 percent price deviation.
Someone looking at Binance could easily compare: what's the current price of ETH on Avalanche, and what's the price on Binance?
And you can pretty much front-run that: if the price is going to go up on Binance, the price of ETH on Avalanche that the push oracle sends is going to be a bit laggy.
So you could almost take a long position before the oracle would update towards the price that is live off-chain on many centralized exchanges.
So this on-demand model actually enables you to get the price at the time you want it.
And for example, I think the biggest perpetual protocol using Pyth today is Synthetix.
I think they support like 75 markets today and have done close to 40 billion, or just over 40 billion, in trading volume in 2023.
So one core aspect of Pyth is this on-demand model.
I mentioned it already, and I think it's a really good benefit of Pyth, or shows how much it scales: the same catalog, all the price feeds Pyth supports on Pythnet, what we call our own blockchain, are actually permissionlessly available on all the blockchains we've deployed our smart contract to.
Usually I compare it to a food buffet.
We're not going to bring the data to your table, but we're going to enable you to just visit our buffet of 400 price feeds.
You pick whatever you want, whenever you want, and then kind of go back to your table and eat it.
Or in this case, your DeFi app leverages this price, and whether it's to fill a trade or do a liquidation, it can do it.
So this on-demand model was, let's say, quite novel when we shipped it like a year and a half ago, and since then we've seen the likes of RedStone adopting the same.
So it's super cool to see that perpetual protocols became way more efficient with this on-demand model: fewer fees charged, more assets supported, and overall volume has grown.
So I think this is one core part of Pyth.
How can we make it happen? It's also thanks to Wormhole.
So I mentioned Pyth has its own blockchain, a Solana virtual machine.
So we pretty much forked Solana, and all the data publishers, we have close to 100 today, are also validators of this blockchain and constantly send transactions to it saying Bitcoin equals X.
And on this Pythnet, on this blockchain, there's only one single smart contract, which is the oracle aggregation contract.
I won't go too much into this, but we kind of created a token there, although it's kind of a useless token, and it works kind of like a proof-of-authority chain.
So only data publishers are validators.
This way we can ensure that even if one third of the publishers go down, their validators go down, the whole network continues to work.
And on top of it, you have Wormhole, which enables us to create proofs of prices and enables applications to send them to any chain.
So overall, tech-wise, you can imagine the Pythnet SVM chain at the bottom, Wormhole on top observing what happens and creating proofs of what happened, and all of this is streamed directly to applications, to anyone.
And then when you want to use the price on-chain, you have to send an Avalanche transaction with the proof of price, so you can consume it there.
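As a toy illustration of what that single aggregation contract does, here is a short TypeScript sketch that combines many publisher submissions into one output price. This is not Pyth's actual aggregation algorithm (which also weighs each publisher's confidence interval); it just shows the idea that a single contract combines all the inputs and that an outlier or faulty publisher should barely move the result.

```typescript
// Toy illustration of an oracle aggregation step: combine many publisher
// submissions into one price. NOT Pyth's real algorithm; a simple median is
// used here only to show the shape of the problem.
interface PublisherQuote {
  publisher: string; // wallet address of the data publisher
  price: number;     // e.g. "Bitcoin equals X"
}

function aggregate(quotes: PublisherQuote[]): number {
  if (quotes.length === 0) throw new Error("no publisher quotes");
  const sorted = quotes.map((q) => q.price).sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  // Median: a few outlier or malicious publishers cannot move the result much.
  return sorted.length % 2 === 1 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Example: 5 publishers quote BTC; one is way off, the aggregate barely moves.
console.log(aggregate([
  { publisher: "0xA", price: 43110 },
  { publisher: "0xB", price: 43125 },
  { publisher: "0xC", price: 43118 },
  { publisher: "0xD", price: 43122 },
  { publisher: "0xE", price: 99999 }, // bad input
])); // -> 43122
```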
I guess two more follow up questions from my side.
How does Pyth compare cost-wise in USD from a developer standpoint, in terms of pinging the oracles, but also maybe, how heavy is the integration for new chains, that sort of thing?
And then another question is in relation to which protocols on Avalanche are currently using Pyth.
Yeah, so cost-wise, it's going to vary quite a lot, and that's kind of by design.
So it's permissionless, and as I mentioned, the oracle itself never pushes prices by itself on a chain.
This is delegated to the end user slash the application.
So let's imagine there is a single application that wants to use Pyth on Avalanche, let's say.
They'll be, at the end of the day, in charge of doing all the price updates.
Depending on the type of application, you have various ways to integrate it.
If you're a borrow-lending protocol, by nature there is less leverage on borrow-lending protocols, so you can have, let's say, less frequent price updates.
For example, recreating the Chainlink model, where you have a price update every five minutes or every few bps of price deviation.
And so cost-wise, let's say you do 100 price updates per day, it would be like 100 Avalanche transactions.
So if you multiply that by the cost of an Avalanche transaction, using Pyth might cost you, I don't know, $10 per day.
So very much the big chunk of the cost is just going to be the application doing the Avalanche transactions themselves.
Depending on the application, you can integrate it in such a way that you don't actually pay for it, but the end user pays for it.
So I mentioned the example of perp protocols.
In this case, you can actually have the trader be in charge of doing the price updates.
So whenever they click long or short, the UI and the contract are going to actually craft the price update and embed it into the user's trade.
So the trader will actually first do the oracle update and then the trade, and get filled at the price that was just sent with the trade.
When applications do this, it's actually free for the application, but at the end of the day, the user is going to pay for it.
So you have nice mechanics: you can, let's say, reduce trading fees because the end user is going to actually pay, or the app can take on the gas fees but just charge more in trading fees to kind of cover the cost you incur.
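A minimal sketch of that trader-pays pattern, from the front end's point of view: the dApp fetches the signed update data, then embeds it in the user's own trade transaction so the oracle update and the fill happen together. The perp contract and its openPosition signature here are hypothetical; only the overall flow is the point.

```typescript
// Sketch of the "trader pays" pattern: the dApp front end fetches the signed
// price update and embeds it in the user's own trade transaction.
// The perp contract and its openPosition signature are hypothetical.
import { ethers } from "ethers";

const PERP_ABI = [
  // hypothetical perp market entry point that accepts Pyth update data
  "function openPosition(bytes[] priceUpdateData, bytes32 market, bool isLong, uint256 size) payable",
];

async function openLong(
  signer: ethers.Signer,
  perpAddress: string,
  market: string,
  size: bigint,
  updateData: string[],   // fetched off-chain as in the earlier sketch
  updateFee: bigint,      // returned by the Pyth contract's getUpdateFee
): Promise<void> {
  const perp = new ethers.Contract(perpAddress, PERP_ABI, signer);
  // One transaction: the contract first forwards updateData (and the fee) to the
  // Pyth contract, then fills the trade at the price that was just posted.
  const tx = await perp.openPosition(updateData, market, true, size, { value: updateFee });
  await tx.wait();
}
```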
And one thing I want to add here is that once the Pyth smart contract on Avalanche receives a price update, anyone else on that chain can access it for free.
So you kind of have a network effect: the more users, or the more price updates, the cheaper it actually becomes for everyone.
So today, for example, on the chain where I think we have the most activity, they do between 20K and 60K trades a day, and every trade does a price update.
So if you were an application on that chain, you'd be able to almost free-ride on all these price updates.
You have to be careful when you free-ride, though, because what if they stop trading that asset and you're just stuck with an old price?
So that's more on the safety design when you integrate Pyth.
But on the consumer side it's almost designed as a public good: someone does a price update, and anyone else can read that price update for free afterwards.
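On the consumer side, the safety check Mark mentions could look something like this sketch: read whatever price is already sitting in the Pyth contract for free, but reject it if it is older than your own staleness bound. The ABI fragment mirrors Pyth's IPyth read method as I understand it; treat the exact shape as an assumption.

```typescript
// Sketch of the consumer-side staleness check: read the free, already-posted
// price, but refuse it if nobody has paid for an update recently.
import { ethers } from "ethers";

const PYTH_READ_ABI = [
  "function getPriceUnsafe(bytes32 id) view returns ((int64 price, uint64 conf, int32 expo, uint256 publishTime))",
];

const MAX_AGE_SECONDS = 60n;

async function readFreshPrice(
  provider: ethers.Provider,
  pythAddress: string,
  feedId: string,
): Promise<number> {
  const pyth = new ethers.Contract(pythAddress, PYTH_READ_ABI, provider);
  const p = await pyth.getPriceUnsafe(feedId);
  const now = BigInt(Math.floor(Date.now() / 1000));
  if (now - p.publishTime > MAX_AGE_SECONDS) {
    // Nobody has pushed an update recently: don't free-ride on a stale price.
    throw new Error("Pyth price too old, submit a fresh update first");
  }
  // The price is a fixed-point integer: price * 10^expo.
  return Number(p.price) * 10 ** Number(p.expo);
}
```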
And specific to Avalanche, we have, and it's a pretty cool design, OmniBTC.
It's a borrow-lending protocol that is both on EVM but also on Move chains, specifically Sui.
So they actually first integrated on Sui and have since expanded from there to EVM with Pyth.
And for them, this on-demand model is great because regardless of whether they are on Sui or Avalanche, they get access to the same price feed catalog.
So they can easily support the same assets and have the same frequency of updates on every chain.
And another one that is more to come, actually we had a space recently with them, is Pike Finance; they are available on testnet, and I'm pretty sure that mainnet is coming very, very soon.
So we shall have a few more Pyth users on Avalanche.
Nice, I guess a final question for me before we kind of transition over to the Movement team: did you guys have a thesis for starting with Solana and non-EVM chains first, or alternative VM chains?
Did you guys see an opening in the market share there and decide, hey, that's gonna be the easiest way for us to gain market share, or what was the thesis behind that?
So we initially launched the V1 of Pyth, let's say, in mid-2021 on mainnet on Solana.
And today, let's say, our V2 has been live on like 50 chains since towards the end of 2022.
I think it was mostly on the design side.
We very much wanted to have an oracle with price feeds that update fast.
So Solana back then, and arguably still, is among the fastest chains overall, like 400-ish milliseconds.
So the idea of having price updates two or three times a second was overall good enough for the current state of DeFi.
But mostly, the other part is that we actually wanted to do everything on-chain, starting from the publishers sending their inputs, so that everyone can verify and ensure that the smart contract aggregation, the aggregation of all the inputs into the output, is done in a fair way.
So the speed and the low cost of Solana was kind of our, all right, let's just do this V1 there as a proof of concept.
We were back then pretty much the first oracle to go live on Solana in 2021; the other one was Switchboard, also native to Solana.
And overall, Solana grew from pretty much zero to a few billion in TVL.
So we kind of had most of the market share for oracles on Solana by the end of 2021.
But everyone knows what happened to Solana over 2022: downtime, et cetera, because of too many bots spamming NFT mints, various reasons.
And our goal wasn't to just deliver on Solana alone.
And actually, one of the things I very much like about Pyth is: wherever you build, we actually don't care that much, as long as the tooling is there for you.
Since we created those price feeds on Solana, when Solana is down, we cannot create price feeds.
When all your users, all the apps that use you, are on the same blockchain that is down, it's actually fine: no one needs a price because they cannot do anything anyway.
But the problem is when you try to have non-Solana apps use these prices.
What if Solana goes down again? You don't create prices, those apps cannot use you, and we pretty much break those applications.
And so here came the idea of let's fork Solana.
We kept the same initial V1 design, but on our permissioned chain, where you don't have NFT mints and you don't have gas costs like on Solana.
But when SOL goes to $200 and the Pyth oracle does a lot of transactions, it starts to not be sustainable.
And just for reference, at some point, if you look at all the real transactions on Solana, removing the consensus votes, for many months, if not a full year or more, the Pyth oracle was actually doing half the transactions or more.
And arguably it's not cheap.
And so having our kind of permissioned blockchain, reusing the same tech, also reduced the cost and reduced the reliance on a public blockchain like Solana.
And from there we could just leverage Wormhole to go everywhere.
So the idea was, it was great to do it on-chain, and as we tried to be on all the blockchains, we saw the bottleneck, or the potential risk.
And for the long term, it makes more sense to have your own kind of app chain where you do the core business itself.
Okay, that makes total sense.
I guess transitioning over to the Movement team.
Actually, I had one more question for Mark here.
Mark, so we obviously talk to a lot of DeFi protocols and also teams outside of DeFi, and oracle fees always come up. Love to understand that a little.
And one question that keeps coming up is, who are the data providers behind these feeds?
How can we verify that they're pushing the right data on-chain?
Could you maybe dive into that?
Like, how do you guys onboard data providers today?
I know that you guys try to keep it pretty transparent on the site, right, keeping at least a list of, you know, currently onboarded partners there.
And also, you know, what is the process for them to actually push that data through on-chain?
And, you know, if there's a way you're incentivizing it, right, with either the Pyth token or something else, we'd love to learn a little bit more about that as well.
So we have today like 100-ish data publishers.
We have various types of data publishers on the Pyth network.
It ranges from exchanges, so crypto centralized, crypto decentralized, and also actually US stock exchanges, all providing data to most of the equities feeds we have on Pyth, or the ETFs.
On the crypto centralized exchange side, we have pretty much all the big boys you can think of: Binance, Bybit, KuCoin, et cetera.
And for the DEXs, we have a bunch of Solana ones because initially we were on Solana.
Recently we onboarded Trader Joe as a data publisher, and we have Osmosis for the Cosmos ecosystem.
And very much, with an exchange, for you to know, usually they own the order book, and from this they can derive the price.
The midpoint between bid and ask is arguably the price, and Binance, for example, for Bitcoin, they're gonna keep streaming their midpoint.
The other side of publishers is more like traders, trading firms, whether they do OTC, market making, or any type of special services.
And we have big names there, whether it's crypto or TradFi.
So on the TradFi side, we have the likes of Virtu, Tower, Jump Trading, Jane Street, which are all kind of top-10 worldwide trading firms.
We also have very crypto-native ones: Wintermute, QCP, Auros, various kinds of firms.
And so these publishers, especially the trading firms, and again, if you're kind of a TradFi native or interested there, for the longest time they kind of hated on exchanges, especially on the US stock side, because you have to pay pretty much to get connected.
And many traders were kind of feeling like, we pay millions of dollars a year to many exchanges for data.
it's just repackaged data of the trader.
Like a trader is going to put bid and asks.
And overall, you kind of pay for something
that without you, the exchange would be debt.
And so that's like in the US,
it's a big topic of like financial data is too expensive.
And so the fifth idea of like onboarding,
potentially this type of financial service providers
to contribute their proprietary data
to actually make a revenue.
So almost they kind of become what the exchange were to them,
a way to monetize their data.
And why can they? If you take the example of Jane Street, they're going to trade Apple stock on like 10 different exchanges every millisecond.
And based on all their fills, which is data they own, they can kind of create a synthetic price, or derivative price, and publish it fully legally.
So all the Pyth price feeds end up being a mesh of exchanges and traders.
You can see it on the website; I think the most visual way is if you go on the website and open any price feed page, below the chart you'll see what we call price components: the public wallets of all the publishers and their current price history.
So right now you see the wallet addresses.
One day we might actually de-anonymize this, and by we, well, I'll get to how we onboard them in a second.
Initially we just put this up so you can see all the inputs live directly on the website.
So how do we onboard publishers?
Up until, let's say, the token going live, the Pyth Data Association was kind of the steward of the network that onboarded publishers.
And what did we care about? We care about: do you own your data?
Because for plenty of legal reasons, you're not allowed to just fetch data somewhere, submit it elsewhere, and just make a buck out of it.
And also we wanted to have data, or price feeds, outside of crypto.
And you're not gonna be able to find live Apple stock prices out there.
You need to either be connected to Cboe, or you need to ask those trading firms that you don't even know where their office is.
And so having this very much enabled us to not only support crypto assets, but ETFs, FX, commodities, stocks, and soon US rates and yields, et cetera.
So in the beginning, the association was very much vetting those publishers.
In November, the Pyth token was launched, and the staking slash governance contract is out.
So all the big updates kind of happened there.
And we're currently discussing the whole constitution of the DAO.
Hopefully it's gonna get ratified, or voted on-chain, next week.
And if it's interesting to you, come to our Discord; we have our whole feedback loop there.
And once, let's say, this DAO is fully formed, the DAO plus the councils will be in charge of onboarding the publishers.
So in the future, potential publishers would arguably just come to our Discord, let's say, kind of chat and put out a proposal: we'd like to become a Pyth publisher.
And after that, they have to convince the DAO to accept that privilege.
Hopefully we can onboard some more of our DEXs.
So sounds like they'll soon have to submit some governance proposals, which we can obviously help them out with.
Before we hop over to the Movement team, just because, Mark, you mentioned something that I actually wanted to ask about:
you guys did release a token recently.
What's the purpose of that token, the utility, that sort of thing?
Sure, it's purely a governance token.
So now all the multisig powers are replaced by it, for example the whole smart contract upgrades, et cetera.
And after that, we'll see what the DAO decides to do.
But for example, what can governance decide?
So another kind of cool thing about the Pyth pull model is, first, the oracle doesn't incur gas costs, which is I think like 90% of the cost of any other oracle.
But as people pull price updates and interact with the Pyth smart contract, applications have to pay an update fee, which gets accrued to the smart contract and is owned by the DAO.
And for now, the update fee is one wei on all the chains, because we designed it like that on the first day.
But I guess as the constitution gets ratified, the first councils get set up, et cetera, the first topic I expect to see the stakers bring up is: all right, let's increase the fees.
Because at the end of the day, that's how the revenue generation of the network is gonna happen.
So overall, yeah, it's gonna go from one wei to however many wei.
And this will be done on a per-blockchain and per-price-feed basis.
Okay, so I guess transitioning from Pyth to Movement:
Cooper, I was wondering if you could give us kind of an overview of what you guys are building over at Movement.
Maybe diving deep into what M1 is, what M2 is, why Avalanche, that sort of thing.
And quick shout out to the host
for helping me deal with these technical difficulties
for getting back in as a speaker.
I mean, yeah, we were founded to decentralize the Move language, with a vision of Move being accessible, as well as this bigger vision that the tech you build with should not silo you into any given place.
And so, kind of pursuant to this modular thesis, we wanted builders to be able to take whatever building blocks they wanted, bring them together, create novel applications around them, and then be able to share that with everyone, kind of like an Amazon here.
So what initially brought us to Avalanche: well, firstly, we were scoping out networks that had the ability to support multiple virtual machines and to create highly customizable, not only blockchains, but also blockchain frameworks for future folks to create app chains with.
And the two networks that stood out the most for making that possible were Avalanche and Cosmos, where Cosmos is really cool, really decentralized, and has a lot of technology for the creation of app chains and the creation of frameworks.
However, each app chain was its own independent environment; there wasn't necessarily anything unifying them.
Whereas with the Avalanche model, not only do you have the ability to create a very customizable network, even with different virtual machines, but there's also a lot of technology that's been built by the Avalanche team and Ava Labs that fosters and facilitates the use of these.
So one of the big things is Warp Messaging, enabling seamless communication between different blockchain environments on top of the Avalanche network.
And in particular, this is VM-agnostic, right?
So it doesn't matter if it's a Move network talking to a Solidity or a Rust-based network, they're able to really easily communicate with each other.
And then having this shared environment of users, this shared environment of liquidity, and that user experience for the folks interacting with these different blockchains, means that if you're launching a new application, you're not starting from zero, competing with a hundred other very well-established applications for their piece of the pie.
Rather, it's something that's been designed to grow the pie together.
In every piece of the tech for the Avalanche subnet stack, and the surrounding tech that's recently been pushed out, you can really see that in the product design: it's designed to be like a rising tide that lifts all ships, designed to promote connectivity, and to really lean into this advantage that Avalanche has over a lot of other blockchain networks.
with M1 on Avalanche as well.
And we're starting to build out our decentralized sequencer,
which is then going to be able to extend Snowman consensus
So what's really cool about Snowman here
is that the next great leap forward
somewhat to how move is the next leap forward from Solidity,
you have a faster consensus mechanism.
So you're able to see instant finality
on Avalanche and subsecond as well.
Then additionally, it's significantly more scalable
where with Nakamoto consensus,
you have to pull a majority of your validators.
So the more validators you have, the longer it takes.
So a lot of networks, the cap of 100 validators,
because if you go past that,
it starts becoming a pain point for the user experience.
they have a gossip-based model for the Snowman consensus
such that they have over a thousand validators right now
for a very decentralized network
without impacting the latency or the performance
or time it takes to achieve consensus here.
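To see why that sub-sampled approach scales, here is a toy TypeScript simulation of the idea: each round polls only a small random sample of validators and builds confidence from repeated agreement, so the per-round cost does not grow with the validator count. This is a simplified sketch of the Snowball/Snowman intuition, not the real Avalanche implementation or its production parameters.

```typescript
// Toy simulation of sub-sampled voting: poll k random validators per round and
// decide after beta consecutive alpha-majorities. Simplified for illustration.
type Choice = "A" | "B";

function snowballToy(
  validators: Choice[],   // current preference of every validator
  k = 20,                 // sample size per round (independent of validator count)
  alpha = 15,             // quorum within a sample
  beta = 20,              // consecutive successes needed to decide
): Choice {
  let preference: Choice = "A";
  let consecutive = 0;
  while (consecutive < beta) {
    // Sample k random validators and count their votes.
    let votesA = 0;
    for (let i = 0; i < k; i++) {
      const v = validators[Math.floor(Math.random() * validators.length)];
      if (v === "A") votesA++;
    }
    const winner: Choice = votesA >= k - votesA ? "A" : "B";
    const winnerVotes = winner === "A" ? votesA : k - votesA;
    if (winnerVotes >= alpha) {
      consecutive = winner === preference ? consecutive + 1 : 1;
      preference = winner;
    } else {
      consecutive = 0; // no alpha-majority in this sample, reset confidence
    }
  }
  // Cost per round is O(k), so adding more validators does not slow the poll down.
  return preference;
}

// 1,000 validators, 80% already prefer "B": the node quickly converges on "B".
const network: Choice[] = Array.from({ length: 1000 }, (_, i): Choice => (i < 800 ? "B" : "A"));
console.log(snowballToy(network));
```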
So what we're planning to do with that is we can then turn the Movement network into our rollup on Ethereum, and then become the pinnacle of modularity, in which we're going to have the best consensus, the best execution environment, and the best data availability options.
One of the big ones we work really closely with is Celestia, so I'm really excited for all of the fun things with TIA, and the fun things that MilkyWay is enabling on Movement as well for your staked TIA.
And then you have the settlement layer of your choice.
So really excited for what's to come.
Avalanche has a lot of the best tech that's come to market recently of any blockchain, as well as a really awesome builder environment: an incredibly welcoming team, incredibly helpful when we had any questions regarding the things that we were building with, and really just a great place to build.
So I also want to shout out the recent programs for new crypto builders to kind of dive into, these crypto schools and accelerators here.
Yeah, thanks for that overview, Cooper.
Obviously we really do appreciate that you guys have chosen to start on Avalanche and use it as a home base.
But personally, I'm really impressed by the modular design that you guys have built up, which allows you to collaborate with partners across different ecosystems and with different service providers, and to be able to host different types of projects.
I would say, having been on Avalanche for a while, most of our community is very familiar with EVM, and just EVM chains in general, and probably less familiar with non-EVM chains, right?
Some may have obviously dabbled in Cosmos, Solana, or maybe even Polkadot back in the day.
But I would say few have really taken a deep dive into Move and how Move smart contracts differentiate from Solidity.
Could you maybe give a quick primer on that?
I know Rushi has been out there evangelizing Move.
But we'd love it if you can, from your perspective, describe the benefits and differences between Move and Solidity, which more people here are used to.
The way that I think about Solidity versus Move: Solidity was a great leap forward at its time, being the first smart contract development language ever.
But that was almost a decade ago.
And if you think about taking such a great leap forward when your target is thousands of miles away, it's gonna be really difficult to hit the bullseye, now that Web3 is starting to orient itself more towards consumer, more towards real world assets, and these use cases where ownership is very, very important.
And the way that Solidity contracts are written, assets, when they're going from one place to another, kind of disappear and then pop out from the ether, and in that space in between, they're riddled with vulnerabilities and potential exploits.
Exploits can be so esoteric in nature that the Kyber attack, for example, was something that seemed statistically impossible.
And then you see so many different forks and so many protocols built with this code that people thought was secure, that suddenly you have exploits rippling out and you have no way to tell what is safe and what's about to be hacked by some giga brain out there.
So where Move steps in is that it was built by Facebook.
They wanted to onboard their institutional finance partners and their big tech partners to put hundreds of millions, if not billions, of dollars on-chain, as well as a lot of sensitive data.
So they created a language that was not only more intuitive, so that it's easier to learn, and when you're showing this to a bank, it's going to be much easier to explain how these contracts are working.
But additionally, it's broken down into what's called modules.
And within these modules, you're defining, firstly, who has access to them, so what contracts or what addresses are going to be able to interact with this.
And then secondly, from that module, where assets can flow to and from.
So you're essentially creating pipelines with hard-coded guardrails as to who has access and where assets can flow to and from.
So if, for example, you had a vault: firstly, an external address wouldn't be able to access that vault; they would have to gain access to an authorized address.
And then secondly, they wouldn't be able to withdraw those funds to wherever they wanted to; they could only move them within the context of the logic of the module.
So it's significantly safer.
In addition, the way that the Move language was developed prevents the most common attack vectors that we see in Web3, like re-entrancy attacks, as well as some other vulnerabilities that plague the Solidity language.
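Move is its own language, so purely as a conceptual analogy, and in TypeScript rather than Move, here is the kind of guardrail Cooper is describing: a vault that can only be touched by the holder of an explicit capability and whose funds can only flow to destinations its own logic allows. Real Move modules enforce this at the language and bytecode level; this sketch only mirrors the access pattern with ordinary runtime checks, and all the names are illustrative.

```typescript
// Conceptual analogy only (TypeScript, not Move): a vault where (1) you need an
// explicit capability to touch the funds and (2) funds can only flow to
// destinations the module's own logic whitelists.

// "Asset": the only way to get a Coin out is for the vault to emit one.
class Coin {
  constructor(readonly amount: number) {}
}

// Capability handed to the authorized admin address.
class WithdrawCapability {
  constructor(readonly holder: string) {}
}

class Vault {
  private balance = 0;
  private readonly allowedDestinations: Set<string>;

  constructor(readonly admin: string, allowedDestinations: string[]) {
    this.allowedDestinations = new Set(allowedDestinations);
  }

  deposit(coin: Coin): void {
    this.balance += coin.amount;
  }

  // Guardrail 1: caller must present the capability (an authorized address).
  // Guardrail 2: funds may only flow to a destination baked into the module.
  withdrawTo(cap: WithdrawCapability, destination: string, amount: number): Coin {
    if (cap.holder !== this.admin) throw new Error("not an authorized address");
    if (!this.allowedDestinations.has(destination)) throw new Error("destination not allowed");
    if (amount > this.balance) throw new Error("insufficient balance");
    this.balance -= amount;
    return new Coin(amount); // the asset moves, it is never duplicated or lost in between
  }
}

// Usage: even a caller holding the capability cannot drain funds to an arbitrary wallet.
const vault = new Vault("0xAdmin", ["0xTreasury"]);
vault.deposit(new Coin(100));
const cap = new WithdrawCapability("0xAdmin");
vault.withdrawTo(cap, "0xTreasury", 40);       // ok
// vault.withdrawTo(cap, "0xAttacker", 60);    // throws: destination not allowed
```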
And so what we've found as Move builders is that Move, it was built by Facebook, right?
They poured countless millions of dollars into making a next iteration of Solidity based on the Rust language, which is also known for its security and its good use cases around things like assets, with additional type safety there.
So it kind of beats Solidity in every single way, except in terms of adoption.
And that's where, when we're thinking of builders being able to build with whatever tech stack they want, that's the sort of thing we felt particularly as a pain point: a lot of chains are geared only towards Solidity, only towards a certain stack.
And so we really want to modularize the Move language and make it accessible around the world.
We want to spread the gospel of Move security and performance.
And just speaking really tactically, you mentioned M1 will be the first deployment from a chain perspective, right, for Movement.
Could you maybe direct people, or let people here know, how to access the devnet?
I understand that's live; I think the M2 devnet might also be live, right?
And then also, just kind of the future plans for testnet and eventually mainnet as well.
onto our DevNet right now.
Interest protocol is one of the really exciting ones.
You can also get a wallet either from RazorDAO or a desig.
And both of these wallets,
they're gonna help you like support with the AppDA suite.
But as we get closer to testnet, and we've announced our Parthenon roadmap of what it's going to look like as we get to that place, we're firstly going to be doing a lot of stress testing.
So kind of onboarding the praetorian guard, if you will, of validators: some folks who have brought networks from zero to one before and are assisting us in that process, with everything that needs to be tested there.
Then we're going to slowly grow and decentralize the validator set, including onboarding some folks from the ecosystem and from the community as well.
And then we'll be going into the period in which, you know, the testnet will be properly opened up, and folks will be able to interact with the different applications that are building on top of us.
If you want to see a sneak peek at what this ecosystem looks like, I highly encourage y'all to go check out Movement Labs.
And we're going to be holding spaces later today, and we'll be dropping some big alpha there as well.
So you can get a peek of what's to come once we do open up the testnet to everyone, as well as see the Parthenon roadmap there.
Yeah, and also not to mention,
we're doing the social and consumer hackathon with you all.
So hopefully we'll get some more builders as part of that.
And, you know, eventually more infra
and framework partners as well.
Yeah, we're so excited for that hackathon as well.
Really honored to be posting some bounties.
It's one of the few ways right now
that folks are going to be able
to earn future network tokens.
So we love giving grants to builders.
We're really looking for folks who want to build frameworks for developers, kind of like public goods, such that other builders who want to create social applications or consumer applications are going to have an easier time bringing those to market, as well as it being a public good for the ecosystem.
But also if you want to build, you know, any SocialFi applications or, you know, something else, we're also really excited to be able to offer grants and bounties and whatnot, in a way where you can build what you want to build and we're going to support it and find a way to, you know, get people using it.
So I really encourage everyone as well to check out that DoraHacks page for the Avalanche consumer hackathon going on, and then get in touch with the team with anything that y'all want to build.
And as Coop said, it's on the DoraHacks website that it's hosted, and it's called Avalanche Frontier, for those of you interested.
Feel free to also DM myself, Matt, Coop, you know, any of us.
Yeah, so while we have 10 more minutes, I wanted to kind of turn it over to ending remarks, in terms of how Pyth and Movement are collaborating together.
I know you guys just announced a partnership, I believe within the past month. I guess, what does that look like?
How does Pyth integrate into the Move stack, et cetera?
Yeah, so the Move stack is essentially the culmination of all of the toolkits and all of the opportunities for developers.
So what's really cool about this Pyth integration into the Move stack is that suddenly all of the applications, all of the folks that are plugged into our Movement network, are now able to seamlessly integrate Pyth price feeds into their data.
And so for any folks who are integrated in here, they're suddenly growing their reach of applications out of the box, in a snap.
And then as the Move stack and their open framework get integrated by networks and folks, that then increases the reach of their customer base there as well.
So it's really exciting that a lot of the applications within our network are going to have access to the highest quality price feeds, as well as even be able to contribute and create further products around these, as well.
For Pyth, being already live on Aptos and Sui, and having Movement kind of trailblazing with the Move environment on top of Avalanche, it was a very clear, let's say, home run, even if we haven't hit the ball yet.
It's gonna bring many new types of developers to the Avalanche ecosystem, so that's super exciting.
And like I mentioned earlier, at the end of the day, with the Pyth model, we just need to deploy a smart contract to a new chain.
We wanna be everywhere, just to enable that, because regardless of where the next Aave, Compound, or whatever app you like gets created, at the end of the day, we want to be part of it.
And the Movement Labs team is amazing; Move code is amazing too.
So very much looking forward to these new types of coding environments and languages on top of existing EVMs, so very excited by that.
Awesome, I guess kind of transitioning now,
if anyone wants to hop up from the audience
and ask any questions, Kyle can move you up,
just request and he can kind of get on that.
It doesn't look like we have any requests so far,
but we got a couple minutes here.
So if anybody has a question that they wanted to ask
either of our guests here,
obviously we wanna stick to the topics that we've been discussing,
but feel free to request up and hit the mic button on the bottom of the screen.
We probably have time for a question or two,
but there's lots to dive in here
and lots of amazing information that was shared today.
So even beyond this, I'm sure there's plenty of ways
to get connected with either Cooper or Mark or their teams.
I know both of their actual brand accounts
are here in the audience as well.
So yeah, there's definitely ways to get connected.
We do have one question that's popping up here.
So, okay, we got it, bringing you up.
So, Three, you're up, what's going on?
Hey, what's going on guys?
Good, good, good, welcome.
I just wanted to say congrats to Avalanche's success
and what they've been able to build so far
and what they got going up in the future.
We were interested in possibly seeing, as an indie development team, what type of infrastructure support or anything could be offered for a team joining their infrastructure, if they're thinking about it, or the ways to do so.
I heard that there was a link.
I'm wondering maybe you guys could pin it
or send it to me in a DM.
We're highly interested in seeing what's out there.
We've been building on Polygon
and we have a cool little project
that we're kind of finalizing,
but we could maybe think about moving it to another chain
that could be more perceptive
or appreciative of the hard work behind it as well.
We've had a hard time getting in contact with Polygon itself, and so it'd be cool to actually be able to work with the folks behind the infrastructure, behind the veil, who've put in the hard work, and communicate with them.
So yeah, it'd be awesome to connect.
Cool, thanks for coming up
and thanks for showing interest in building on Avalanche.
So Eric or Matt, who are up here as speakers,
they can definitely help get you connected through DMs after this and figure out what the next steps would be there.
So they're the best ones to reach out to, Matt and Eric, if you guys wanna follow up.
Yeah, it might be worth pinning Codebase as well, which is our own Avalanche-based incubator for teams wanting to build here.
It walks teams through their very earliest stages, helping with strategy, economics, what successful product-market fit looks like, as well as raising a round, that sort of thing.
So it might be worth pinning here for that application as well.
Yeah, let me grab that tweet and we'll pin it up here.
And I'd love to throw one more thing out there.
So basically, we're pretty much done here, and we only have a few minutes left.
So I wanted to see if you had any questions for Mark or Cooper here, just to honor their time, since we have to wrap up pretty soon.
So I just wanted to kind of keep the conversation focused, if you have any questions for Cooper or Mark here.
Yeah, but yeah, I appreciate you coming out, man.
I saw one come up and then it dropped down.
I think we should wrap things up here.
We're gonna jump over to our in the lab,
kind of weekly Avalanche community forums.
So for those of you that might have questions
about Avalanche overall or how to get connected to the team
or learn about things that are just happening on the network,
we'll open up another room for in the lab.
And that's a much more open forum
where you can ask lots of different questions.
And so we'd love to connect there too,
but let's wrap things up for this one
because I wanted to see if Mark and Cooper
had any final thoughts or anything else they wanted to share,
and ways people can get connected with their teams.
So Pyth-wise, I guess the best place is the Discord; that's where we try to keep the Pythians aware of what's happening, especially now with the whole, let's say, DAO constitution feedback, where a big discussion is taking place.
I think I'm not seeing the Pyth account in here, but I have the Pyth Twitter linked.
Find the Discord, or look up pyth.network on the internet, and you'll find the Discord there.
Come ask us your questions.
And if you just like to use DeFi, I think following along with Pyth is a great way to just keep in the loop of what's happening.
We have users on close to 50 chains, so whatever chain you like, there's a good chance that there is one on your chain.
Now I'm just super excited; we had growth of market share on many, many chains this year.
One we've been lagging on is Avalanche, so this Movement partnership is actually a good way to change that.
I'm very much looking forward to it, and to just helping people build and use the best possible DeFi on Avalanche and Movement.
Awesome, yeah, likewise, it's an honor to get to support some really powerful and cutting-edge tech stacks: the really cool modularization from Avalanche, the way the ownership structures can work from Pyth, and it's cool to see all of this culminating within Movement, and I can't wait to share more with y'all.
Cannot stress enough: check out our Movement page, dive into our docs, and peep our spaces coming up, because you're gonna get a lot of the alpha there.
Yeah, awesome, yeah, there's so much good information
and resources to dive into both of these projects
and it's amazing to see the success
that both of you have had so far.
And again, we just appreciate you coming on and introducing what you're doing to the Avalanche community.
I'm sure there's gonna be plenty more opportunities where we're collaborating and sharing with each other what we're working on.
So more to come, more spaces,
Coop and Mark really appreciate you coming through
and Matt and Eric as always leading the conversation
and just going through what you guys have been working on.
So we're gonna wrap things up here,
I know we have a couple people that wanted to come up
but we gotta wrap things up
to jump over to our other spaces.
So in about two minutes here,
we'll be launching in the lab, our community forum.
But for now, wanted to sign off, thanks Cooper,
thanks Mark, we appreciate you guys.
Thank you all, thanks for tuning in.