Aptos Move Madness 🌐🏀 #AptosAllAccess Takeover 🔥🏆

Recorded: April 2, 2025 Duration: 1:29:06
Space Recording

Short Summary

The discussion highlights several key developments in the Aptos ecosystem, including the launch of new projects like Mirage and Flip Vault, and innovations such as Move 2 features, Loader V2, and account abstraction. These advancements aim to enhance blockchain technology, improve user experience, and provide new opportunities in the DeFi space.

Full Transcription

GM GM, how's everyone doing?
Happy Composability Wednesday.
Oh my gosh, Alex finally made it.
We were worried that you got rugged.
No, I was here.
I was a speaker, and then I got de-promoted, and then you invited me, and I came back up.
I'm glad that you're here.
I'm glad that we're all going to talk about the amazing Move madness that we're in the midst of, and have been in for quite a while. But let's talk about that. This is a space dedicated to all the builders and all the creativity that's going into Move, and what we're doing in the context of making Move on Aptos the best development playground for Web3.
And so, you know, we brought together a lot of really great folks here to talk about that with us today. We've got Econia, Kana Labs, Mirage, Merkle Trade, Everstake, and Flip Vault here. It's really exciting to have all these folks here today. I think we've got Sneha and me running the show for you folks. Sneha, I think this might be your first time on All Access. You have to do a quick intro for yourself.
Yeah, this is pretty exciting.
This Move madness edition that we have here.
So I'm Sneha, I'm the ecosystem DevRel
at the Aptos Foundation.
And today, David and I will be hosting this session
to walk you through all of the main amazing features
that we are coming up with in Move 2, along with these teams that we have. So David, do you want to take it away?
Certainly, certainly. And I just want to call out that it's really unfortunate that our good colleague is under the weather today. Or, she just messaged me that she's abandoned me. Pretty harsh words,
but that's the type of spirit that she brings to the game. I hope she feels better.
We'll do our damnedest in her absence to make sure that we have a nice run of show. But just to be clear, we'll be running a little subpar without Elise's passion and energy that
she brings to it. But really excited to have you here, Sneha. And it's going to be really fun to talk with all of our guests. So just to, I guess, quickly
kind of go over what the plans are for today. We're going to do a quick overview of what
is all this madness that's going on. Where did we start with Move? What did we see over
the past, I guess it's really been about eight months, since we've seen the transition from the legacy of where we began with Move 1.0, or what was Aptos Move at genesis, to where we are today, and then all the other things that are going to be coming thereafter, which we're going to get into in a bit more depth with our speakers that have joined us here today.
And then we're just going to go through a bunch of questions that we've kind of lined up that get into the depth of what this transition from, you know, the legacy or the genesis of Aptos and Move to where we are today means, and how that's impacting our DeFi developers. And we'd love to
save a handful of time towards the end to bring folks within the audience
to come up and either ask us or our speakers any questions that might be on their mind.
And so, you know, without further ado, I think it makes sense to kind of just jump right in. And, you know, I'll kind of kick it off and bring folks right up to the stage to potentially jump in here as well.
I just kind of want to walk everybody back
to the history of Aptos, the history of Move,
and then kind of bring you up to a conversation
between Brian and myself about these new features, these new functionalities that we're bringing to Move, in the broader spectrum of things.
To go back, Move was created at Facebook back in the day for the Libra and then the Diem projects, with the intention of recognizing that the existing paradigms, and particularly in that day it was largely Solidity and Vyper, maybe a little CosmWasm, maybe a little bit of Rust, but those were still pretty nascent at that point in time, so really just the Solidity and Vyper crowds, and all the hacks that we were consistently seeing inside the Web3 space, were not because of the underlying VMs, but much more because of how the languages presented themselves and the way the developers were using those languages.
And in fact, most of these languages, while being Web3 languages, weren't initially or
originally concocted with the whole mentality of we need to be thinking about how Web3 developers are going to be building their applications, but much more about just solving that immediate problem that there was no way to actually construct Web3 applications. This resulted in things like the ability to have double spending or even recursive spending,
so you could infinitely just drain accounts, or just mistakes in programming that would allow for
bad things to happen, such as funds locked into accounts and never retrievable thereafter.
And so just looking at all these quirks and problems, the team at Facebook said,
we can do much better.
And they focused on a language that was much more about resources, about data preservation,
about using high-level constructs like structs to represent underlying stores of value.
And so that brought in the idea of Move, where users or accounts effectively really own the data that's associated with them, or the assets that are associated with them, rather than just the simple context that you see in something like Solidity, where it's literally address maps to value. And that's the basic underlying construct that you find pervasive inside the Solidity space, along with all the crazy stuff that you see on top of that. And so, you know, for those that don't know, we made great, valiant efforts to get Libra and Diem launched.
I will say we're actually really fortunate
that they didn't launch because the direction
that those projects had ultimately ended up taking
was an excessively permissioned environment
that would never have actually allowed Move
to be used by the broader Web3 community.
It was going to be created for a consortium, a small federation of, say, 30 to 40 blessed vendors that could go in and actually even interact with the blockchain and Move, let alone the smaller population that would have actually been able to write contracts on-chain. And so when Libra and Diem failed to launch, we had the great minds, we had Aptos, we had Sui come together, or I guess independently, and say, we're going to make the best of the circumstances. And that brought forth what is now Aptos, roughly early February 2022.
At that point in time, we had a huge mission of actually taking that legacy Move and at least starting to make it permissionless. And so that was kind of like a huge initiative that we saw going into the first eight months of the blockchain, which is from February to October, where we finally had our launch.
It was like, how do we take all these primitives? We make sure that they're safe for anybody to interact with.
We allow for publishing of modules or contracts on chain.
We allow for interoperability of creation of new tokens, of NFTs.
And mind you, a lot of that has now been erased.
We've now moved away from the legacy coin to the fungible asset, except for Alex, who really
loves the coin.
I kind of love the perverseness of the coin, but it has its own problems.
And we moved away from the legacy NFT
that resulted in a lot of excessive data creation
to a simplified object model.
And there's been a lot of really cool stuff in that space,
but there was still a lot of hiccups
in terms of the developer experience
that we were seeing in that space.
And beyond just the developer experience,
there were issues in terms of testability,
and the developer-slash-user experience that can come along
with that. And so the team made dramatic efforts to build a new
compiler, heading into, I would say, roughly about a year, year and a half ago. And the big motivation for that was that the legacy compiler was multiple stages of complexity. I think there were five or so different layers of compilation that ultimately took Move source code and converted it through the pipeline.
And as you can imagine, that amount of excessive complexity means that if you introduce a new concept, such as the ability to implement closures or dynamic dispatch, you're going to effectively have to implement that at every single layer, which means you're going to potentially introduce bugs and all these other costs. And the team said, that's a non-starter for us.
And so that motivated the new compiler.
The new compiler finally launched, I guess, roughly eight months ago, which is the birth of Move 2.
With Move 2, we saw great new constructs such as enums come to the forefront. Enums have been really great for those looking for data upgradability, because now, when you have a new Move program, you can say on day zero, here's the way I want to represent the user's identity in that program. Then on day 30 you can say, oh crap, we forgot all these fields or these different properties that we want associated with that user, and you can seamlessly just upgrade that user to a new variant of that data structure without going through all the pains that are part of legacy Move, where you effectively have to have access to the user's underlying account, which is their signer, to be able to extract and insert the new data. Instead of that, we can seamlessly just upgrade the experience, and everybody's really happy.
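As a rough illustration of the idea, here is a minimal Move sketch of that kind of upgradable resource; the module, type, and field names are hypothetical, and real code would add access control around any upgrade path:

```move
module example::profile {
    /// Hypothetical day-zero record (V1) with a richer variant (V2) added later.
    /// Because both shapes are variants of the same enum resource type,
    /// existing storage stays readable and no signer-driven data migration
    /// is required to introduce the new fields.
    enum Profile has key {
        V1 { name: vector<u8> },
        V2 { name: vector<u8>, avatar_uri: vector<u8>, joined_at_secs: u64 },
    }

    /// Readers simply match over whichever variant an account currently holds.
    public fun name_of(addr: address): vector<u8> acquires Profile {
        match (borrow_global<Profile>(addr)) {
            Profile::V1 { name } => *name,
            Profile::V2 { name, avatar_uri: _, joined_at_secs: _ } => *name,
        }
    }
}
```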
So that was the beginning, and we recognized that there's a heck of a lot more work to be done. In particular, you know, we've got this nice list here. And Brian, do you want to kind of quickly go through and talk about what the big features are that we're seeing coming out of, you know, post-Move 2, with 2.1, 2.2, and what this Move madness is that we're in?
I can't hear Brian. I don't know if he's able to speak.
Hello? Hello?
There we go.
Yeah, I'm Brian.
I'm a DevRel engineer here at Aptos Labs.
Yeah, seeing some familiar faces: Alex, Moonshiesty, Patrick.
So, yeah, with Move 2, I mean, I think there's a set of, let me count here, like seven new features.
I mean, recently we discussed the dynamic script composer, so that's kind of like Sui PTBs. I think we went in depth with Moonshiesty there. That's the one where you can compose multiple Move function calls into one transaction through the TS SDK. I think that one's cool, but the one that's a little bit more interesting to me, that I don't think a lot of people are talking about, is Loader V2.
Just to dive into it really quickly,
it's like kind of this concept where,
I think reading the Medium article,
it's claiming, I think, 60% faster block times, and then, for blocks containing transactions that upgrade Move code, potentially 10x faster.
So I'm really interested to see what type of benchmarks we can get out of this.
But essentially, in the old design, in order to pull a module from storage, you had to, you know, individually go back to disk, deserialize it, verify the bytecode, and then, for that individual thread, it would be loaded into a private cache. So Loader V2 now has this concept where, you know, multiple threads within Block-STM can share a cache. And it also goes a little bit deeper, where it starts using L3 to L1 caches.
And L3 stores modules for one epoch.
L2 is block-wide cache, and L1 is a thread local cache.
So I think that's quite interesting.
And I think another thing it dives into is the concept of also upgrading modules,
which I didn't previously
think about. But let's say in a single block you have two transactions, right? One is upgrading a module and another one is calling a function from that module. So that's interesting, because if these things run in parallel, you now enter a state like, hey, is this calling a function from the old module that was not previously upgraded?
Or was it calling the new upgraded module?
So now you have to revert back to sequential execution.
And that's too slow.
That's things that are done on previous chains.
So I think with Loader V2, now you can now have these things operate in parallel.
And before the commitment stage, it'll check, like, hey, were there any conflicts here?
Did we call a function from an old module?
So I think that's quite interesting.
I might write a thread about it.
Yeah, I think I'm very excited for Move V2.
I wonder what maybe Sneha thinks, or one of our ecosystem partners thinks.
So happy to pass it to someone else.
Yeah, those are interesting points that you made there, Brian, especially about the loader.
I'm also very interested to talk a little bit about our mutation tester, which has come into play.
And so, the way that we think about testing in Move code, and there has been a previous blog post by Aptos on this as well, is that testing in terms of coverage is not enough.
Whenever we are going through a smart contract, we're talking about, okay, these are the different lines that we're trying to target and write some unit tests around.
But that might not completely cover all of the different possibilities in terms of the inputs that can be passed to a particular function under test, right?
And that's where the mutation tester comes into play, which I think is an absolute game changer, especially when we are talking about DeFi applications, and almost all of the top-tier DeFi applications on Aptos are here.
So I'm very excited to talk to the teams about this as well.
So, in order to be able to capture the different sets of values that can be passed to these tests, we have the mutation tester, which will actually mutate the code, try out different sets of inputs, and test these different functions, obviously for the ones that are under coverage. And I think that is a pretty interesting one.
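As a rough sketch of why coverage alone isn't enough (the module, numbers, and exact tooling invocation here are made up for illustration): a test can execute every line yet still miss a boundary that a mutated copy of the code would expose.

```move
module example::fees {
    /// Hypothetical fee rule: 0.3% fee, waived for dust-sized amounts.
    public fun compute_fee(amount: u64): u64 {
        if (amount > 1000) { amount * 3 / 1000 } else { 0 }
    }

    #[test]
    fun test_compute_fee() {
        // Both branches are executed, so line coverage is 100%...
        assert!(compute_fee(2000) == 6, 0);
        assert!(compute_fee(10) == 0, 1);
        // ...yet a mutant that flips `>` to `>=` would still pass. A
        // mutation-testing run flags that surviving mutant; a boundary
        // assertion like the one below would kill it:
        // assert!(compute_fee(1000) == 0, 2);
    }
}
```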
Another one that I'm particularly excited about, and had created a demo around a while back, is account abstraction. So account abstraction, again, I think especially in the EVM ecosystem, has been talked about for a long, long time, and now we have moved on to the chain abstraction narrative as well. But talking purely about account abstraction, the way Aptos has captured this in a very native and very simple manner has the ability to completely transform the DeFi onboarding experience, and it's not just limited to that, it can also be for consumer applications as well.
Especially since our account abstraction consists of an authentication function where you can essentially write any logic, and you can either allow the transaction or abort the transaction based on that authentication function. So you can allow a certain set of public keys to execute on your behalf, or move in some other direction as well. And this can, exactly as I said, completely change the flow of DeFi applications. So yeah, these are a couple of the different features that are part of Move 2.
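For context, a toy sketch of what such an authentication function can look like. The framework module and accessor used here (aptos_framework::auth_data and authenticator()) follow my reading of the account abstraction docs and should be treated as assumptions; a real implementation would verify a signature over the transaction digest rather than compare a fixed byte string.

```move
module example::any_key_authenticator {
    use aptos_framework::auth_data::AbstractionAuthData;

    /// Abort code for a rejected authenticator (hypothetical).
    const EINVALID_AUTHENTICATOR: u64 = 1;

    /// Registered as the account's abstracted authentication function:
    /// returning the signer allows the transaction, aborting rejects it.
    public fun authenticate(account: signer, auth_data: AbstractionAuthData): signer {
        // Placeholder check; real logic would validate a signature,
        // an allowlisted public key, spend limits, and so on.
        assert!(*auth_data.authenticator() == b"let me in", EINVALID_AUTHENTICATOR);
        account
    }
}
```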
And now I think is a great time to actually introduce all of our guests and ecosystem
speakers today and get to know more from them on how we can best leverage all of the things
that we just spoke about. So I think we can just give a round of applause for all of the guest speakers that we have today.
So we have Alex here from Econia.
We have Anto from Kana Labs.
And we have Julian from Mirage.
We have Mark from Flip Vault, Patrick from Merkle Trade, and Anne from Everstake.
So, hi everyone.
Thank you so much for making the time.
So, I would love if we can just have a quick round of introductions
from all of these amazing guests that we have today from the top DeFi projects. So let's kick it off with Alex from Econia. Alex.
Hey, thanks for the intro, and thanks to Aptos for having me.
I want to start by responding to David's quip earlier about how I'm resisting FA, and I will just say that the reason we've built two of our products on coin is because FA was not available yet, and I was still making suggestions on the PRs that were landing, like around a certain minimum amount transferred on dispatchable fungible assets. So, yeah, David, we are building stuff with FA now because it's finally a stable build. We've just been kind of early to some of this stuff. So yeah, in terms of Move 2 language features, we actually have a feature that we're rolling out in the next few weeks.
We are just finishing our alpha testing on it right now. It's called EmojiCoin Arena. So
anyone who's played with EmojiCoin.fun has had fun engaging with different EmojiCoins,
maybe over a longer term time horizon. But for people who are interested in doing more active trading and shorter term engagement with other people in a sort of friendly, playful environment, we're coming up with Emoji Coin Arena.
And it actually uses Aptos randomness.
So one of the cool features is that roughly every day, two random emoji coin markets are selected and then users can
trade against them in a special event called a melee.
And there are also rewards provided by the Aptos Foundation.
So if you lock in early, you can get a portion of the APT that you provide matched and you
can use it to trade emoji coins.
And this includes price charts for one relative to the other.
So it sort of answers this age old question of what are we trading today?
Which is the oft repeated quote in the group chats.
And beyond that, you know, a lot of the designs we're using
are already starting to use things like receiver style, syntax,
and enums and the like.
And we're generally just excited for the features to keep improving.
One thing that I don't recall hearing mentioned,
but which I think is some of the most important stuff is the memory management,
in particular storage slots and things like mem::replace.
So we're looking forward to using those in custom data structures that include
some of the new order book technology that we're already working on internally.
So people might remember us most recently for EmojiCoin.fun,
but we previously built Econia as well,
which was a fully on-chain order book based on the coin standard.
And we're looking to revamp that with the Econia X project we're working on,
which will use FA and include things like AMMs
and the liquidity pools that we've been working with
on EmojiCoin.fun.
And obviously we want to take all this new
high-performance Move 2 language programming features
into everything we're building
and continue to deliver, you know,
fun, engaging UX for the DeFi ecosystem.
So I'll pass on to the next speaker.
Amazing, Alex.
Can we have Moonshiesty from Mirage talk a little bit?
Hey, good morning, everybody.
Thank you for inviting me.
I'm really just here to tell you guys
Mirage is launching in a few weeks.
We already locked in all our contracts. We wrote
the project before Move 2, so we're holding off on picking up any of these new features so we can
launch as quickly as possible, but we're thinking about some cool new products to
build after launch, and of course we'll be using Move 2 for those. So stay tuned for announcements in the next week about when we'll hit mainnet.
Awesome, can't wait for you guys to go on mainnet.
Can we have Anto from Kana Labs?
Yes, hello everyone, thanks for having me here.
So we are currently building our Kana Perps platform, and it is built on Econia. And currently, around account abstraction, we are using the traditional approach of session keys for order placement. And we are planning to use account abstraction for order placing and everything else, to provide a one-click trading experience.
And one more thing I have to mention here is the enums. Our contract started with Move 1, it started way before, and we are still incorporating new features as they come by, and one thing I have to thank everyone for is that it has been quite a big thing for us, and it is really helpful.
And the next thing I would like to talk about is the dynamic script composer. In Aptos, scripts have always been there, but they are not used much compared to normal transactions. And with the dynamic script composer, I think it will add more possibilities to add features like gasless transactions. As I have already talked about earlier, the users can pay gas fees with any token, like USDC or whatever the dapp's native token is. So I think the dynamic script composer is a big thing as well. Yeah, I think these are the things that we are currently working on.
Yeah, thank you.
Thank you, Anto.
And very interesting to know that account abstraction and dynamic script composition are things that you folks are already exploring at Kana Labs.
Can we have Patrick from Merkle Trade?
Hello, can you hear me?
Yeah. Thank you for having me here.
I'm Patrick, co-founder of Merkle Trade. Well, I'm not usually the one who gets to do these kinds of things for Merkle Trade, but today it's me.
So Merkle Trade is a gamified perp DEX. We focus highly on retail users first, which is why we are focused on creating an innovative UX. We use native USDC and CCTP to enable direct deposits from six different EVM chains, which is really good. And we are currently working on bringing EVM wallet connect back, because we used to have that, but now we want to use account abstraction to let users use Merkle Trade with an EVM wallet like MetaMask directly. And recently we added a raffle feature with Aptos randomness, which is really good. We can see that many users love it, so maybe you guys can try it out.
And I'm still proud and pretty excited that we are one of the OG protocols on Aptos.
Awesome, thanks, Patrick. And would love to check out the randomness stuff that you have cooked up on Merkle. Can we have Anne from Everstake?
Hello, everyone.
First of all, sorry, maybe, for my voice, because I caught a cold, and it seems spring is the trickiest season of the year.
Secondly, my name is Anne, and you may know me as Aptos Nerd. I represent the Everstake team today, and it is one of the largest validators on the market.
And a few words about us: the company was founded in 2018 in Ukraine, so we are pretty experienced in what we do, and in June this year we will be celebrating our seventh anniversary. And also, I'd like to provide some key numbers to give a general impression about our stake. We have more than 700,000 users staked with us. We provide validation services for more than 80 blockchains, and we also try to develop different solutions, staking solutions, for example for Ethereum. And I'm also really, really happy to be at this Twitter Space and discuss all things Aptos and Move 2 features and improvements. Thank you.
That was a lovely introduction, Anne. Moving on finally to our last guest speaker for the
day. We have Mark from Flip Vault.
Good morning guys. Thanks so much for having me.
Very nice to meet everyone who doesn't know me. Just a brief background on ourselves.
We are building a live barter trading platform for all digital assets,
mainly NFTs and within NFTs, probably mostly gaming NFTs.
The idea was originally inspired by RuneScape,
as well as other Web2 games like CS:GO.
So it's being able to trade socially with other users
and barter your assets with one another in real time.
It's essentially what we're building.
We just announced our testnet yesterday,
which will be live in a little under two weeks.
So we already have our contracts written up.
So we're currently not implementing any Move 2 features,
but we already have a bunch of ideas in the works
and we'll probably be rolling those out
in the next month or two.
So stay tuned.
I think this is a very interesting project overall.
So yeah, pretty excited to see how that pans out.
So David, I think this is the time that we start unveiling
a couple of the different features that we discussed and talk to our guest speakers about
how they will best leverage all of this in their tech.
I mean, you and Brian gave away so much, and all our great speakers have as well. I guess I kind of want to be a little bit selfish really
quickly right now. And one of the technologies that I think are at the forefront of a lot of folks' minds
is the construct of account abstraction.
And so I want to go down this tangent very quickly and, you know, have the other speakers kind of give us feedback.
But just to go into the depths of what you already spoke about: account abstraction offers, effectively, the ability to bolt it onto existing accounts and allow for arbitrary means to authenticate that account. Which means, you know, I take my current david.apt account, add an account abstraction to allow me to, say, seamlessly trade on Merkle Trade, and that's a really great experience, because now I don't need to go and approve every single transaction in my wallet.
The caveat being, there's another technology called permission signer that would effectively allow Merkle Trade to only interact with my account within certain limits, such as saying Merkle Trade can only access up to, say, 100 USDC and maybe 5 APT, and I can do whatever I want with those inside of the Merkle Trade contract.
Now, one of the challenges is that account abstraction is fully fleshed out, fully on-chain, but permission signer is yet to be fully built out, because it's a complex technology that involves a lot of nuances within the VM itself.
And so the team has effectively made a pause on the rollout of account abstraction out
of concerns that it could be potentially abused by the ecosystem, not necessarily by individuals
within the space, but just by some malicious entity in the ecosystem to backdoor access
into a user's account.
And so, you know, I know I want to go into the depths
of how you might be using account abstraction,
but I kind of want to seed that with the conversation of,
can we imagine a way in which we can allow for account abstraction
without risking the users or at least educating users effectively?
And does it provide that much value versus another technology
that we're building on top of account
abstraction that is called, effectively, derived account abstraction, which allows you to create dedicated sub-accounts for different applications that you can seamlessly move assets back and forth into, and give complete authority to that dapp. And so what this effective flow would mean is, when I go to Merkle Trade, instead of setting up a
permission signer, which doesn't exist yet, I would simply have an account dedicated for Merkle Trade that can only be authenticated when I go to Merkle Trade. And I would transfer over, say, that 100 USDC and that 5 APT, and on Merkle Trade I can seamlessly interact with their dapp however I want.
So, I want to better understand how important it is to try to get account
abstraction out now, or does derived account abstraction largely deal with a lot of the
requirements and usability around account abstraction?
And with that, we can kind of do whatever we want with this.
I'm going to throw it first over to Patrick, since I've been talking to him and you a lot
about the nuances on this topic.
Oh, yeah. Actually, yeah, we are currently really looking at how to integrate account abstraction, and actually, we are also thinking about pretty much the same concern you raised. Well, actually, that part is kind of risky to users, because if, let's say, there is some phishing site that makes a user give them the full power of the authority, then it's pretty messed up.
From our perspective, if there is some scope to that permission, let's say that permission can only call certain contracts or modules or maybe functions, then that kind of integration would be pretty helpful. So we built our own. But if a user just got hacked, or their private keys got stolen by hackers, then there is no way to prevent that.
But we still want to try out account abstraction because, let's say, it's a session key feature: the user can do the transaction without any signing process. Because on Merkle Trade we have really high-leverage perpetual trading, so if a user uses over 100x leverage, then they need to sign as soon as possible. So we're thinking, even if there are some risky parts, we'd love to integrate it and see how it works, how it goes. And of course, we really need to try hard to make the integration really safe.
So, when we integrate account abstraction, I'm thinking that maybe if the user can see how powerful this is and what kind of permissions will go to the signer, if the user can see that in Petra or et cetera, then that maybe could help.
Yeah, I think it's going to be great once we have these features out. The risk there is, we thought about the Petra wallet, but then we have, you know, a handful of wallets that are still pretty reasonably popular in the ecosystem. Just thinking about the potential risks, I think it'd be like 10 to 20% of the population, so we want to kind of hold back on that. So, like, I think derived account abstraction is going to be rolling out to testnet within the next few weeks, and then hopefully to mainnet, you know, God willing, sometime in May or really early June. But with that, I also want to see if there's anybody else on our panel that wants to talk briefly about what they were thinking about with account abstraction, and the motivation for either the native account abstraction, or whether derived account abstraction would be useful or sufficient.
One thing that I think it would really be useful for is pooled liquidity vaults.
So typically when people want to provide liquidity to an AMM, they will just dump it in a pool
and then there will be a constant product curve and fees get reinvested and you earn
your pro rata share as an LP.
But for strategies that use order books or maybe bonding curves or AMMs, or a hybrid DEX that includes all three, like we're working on right now, it's not as simple.
And typically, you might want to have some kind of gamified marketplace where there's different strategies.
So this has been sort of popularized on Hyperliquid with the idea of liquidity vaults.
And most notably, the HLP vault that's been the subject of recent controversy
was one of these.
Basically, the Hyperliquid team was just running the algo behind it.
But if you have you
know a similar system where you can create a place for um you know anyone who wants to provide liquidity
to a market but doesn't necessarily want to actively trade then you can delegate different
sort of permissions to other actors so you could maybe have someone who's allowed to place trades
within, you know, some percentage points of the spread, many times per day, or only do trades of a certain size. So you can do kind of granular permissions, and then you can delegate out
the ability to do that to the vault managers. And I think that's one really interesting use
case that we're interested in for DeFi, just because as the trading systems become much more
complex than traditional constant product pools, you know, x times y equals k, once we get beyond x times y equals k, there becomes a little bit more of an interesting dynamic in how you allow people to provide
liquidity. And I think that account abstraction is going to be crucial to this entire process.
Yeah, I think that's a great point. I think it also plays into the AI agent drive and, you know, making sure they don't go and do anything too crazy and blow up your positions.
Yeah, agreed.
I mean, with AI agents, I think the tricky part is just that you give them a private key
and expect them not to get jailbroken.
Maybe we just need a lot more hacks over the next few years,
and then that'll get kind of figured out.
But I think at this point, anyone who's giving an AI agent access
to more than maybe like 100 bucks for fun is definitely playing with fire.
Yeah. All right. So let's get kickstarted with the questions.
I'm getting poked from behind about moving things along a little bit.
I appreciate everybody's patience with me.
So I'm going to throw it over to Flip Vault first, because you've been pretty quiet. It'd be great if you could tell us more about how Move uniquely enables your mission, specifically a social, live barter trading platform for any digital asset on Aptos.
Yeah, yeah. I think there are obviously several ways. I think one of the key things with what we're doing is, because you have this real-time trading, it's not like OpenSea or Magic Eden where you're just making simple bids on NFTs.
The throughput is, like, high throughput is crucial
as well as low transaction fees
so that people can be constantly, like, making trade offers
because you kind of got to go back and forth to reach consensus.
And so you need everything to happen quickly.
If you're doing this, like, on ETH,
obviously, you're, like, it's pretty hard to have this real-time trading,
especially because the way that our product functions is even though you're in this real-time trading room,
one person has to make the first move and confirm the contract, essentially readying up from the user standpoint, and
that sends the contract to the other user
who then also will then ready up
or modify the trade or cancel.
So high throughput, low transaction fees is very crucial.
And then the other thing that is unique to Move is NFTs natively having that dynamic ability, like having that built in.
And I think it's pretty cool the way that can be integrated with FlipVault,
where NFTs can potentially change as they're traded.
We were thinking actually having our own NFTs.
We're trying to build this out right now where our FlipVault logo could
potentially change colors, which is pretty simple.
But every time you trade it, it can kind of make it a little bit more fun to just trade in general and
with low fees, I mean, that shouldn't be an issue.
Awesome, awesome. Are you guys going to have an NFT marketplace as part of your project?
Yeah, yeah, we'll still have an NFT marketplace, but right now you won't be able to make bids or anything like that. You'll just be able to see which user owns a given NFT.
Cool. So one of the projects that the team has kicked off that might be useful for you folks: we're trying to make a general-purpose Aptos NFT marketplace. You can see it in move-examples. A lot of groups have copied and pasted it into their own space, maybe made some modifications, but fundamentally that makes it really hard to build a general-purpose aggregator across all NFT marketplaces. And so our team that built out the Build product is kind of expanding the horizon, trying to make it so that everybody can get access to all the different NFT marketplaces and also seamlessly build their own marketplaces inside of that space. So I'd love to get you guys connected with that tech suite if you want to be an early developer in that space.
Yeah, that'd be awesome. Hope to get connected after
this.
Cool, cool, cool. So continuing right along. Actually, David, I saw Anto raise his hand when you were discussing the AI stuff with Alex, and I wanted to hear Anto's opinion on the AI stuff, as well as a little bit about his dynamic script composition work. How exactly does it blend in with Kana's DeFi operations?
Yeah, thanks. I was actually going to talk about the account
abstraction that we are going to implement. But before that, I would like to explain how we are doing things right now. We currently use a delegate account, and that delegate account can be used for trading purposes only; deposits and withdrawals can only be done by the users themselves, so those will still have a pop-up. The delegate account resides in the front end, and it will be used for placing orders, updating TP/SL, and all those things. So only the things that need to be executed fast will use the delegate account, and the important functions like deposits and withdrawals will still be done from the user's wallet, which will still have a pop-up and does not need to be as fast as order placing.
So with the current account abstraction, we don't want to give the whole of a user's access to the delegate accounts. Right now, I think we are going to wait for permission signers to enable only selected functions to be accessible from the wallet. So I think that needed to be said here. And the second thing is the script
composers. Currently, what we do is build out the transactions in the back end and send them to the front end for the aggregator, so we are just executing it as a transaction. But with the dynamic script composer, we can build our own aggregator engine on the client side itself, so the transaction building and route building will happen in the front end itself, and it will be executed as a script. So I think that can be done. And the next thing I want to talk about is...
Okay, that's a question: have you actually got that working, or is this a work in progress?
No, it is still a work in progress. We are still doing research, and it is not implemented right now.
Then once you get that done,
I'd love to hear a follow up on how complex,
how effective it is.
Because it's been something that's long believed
is feasible.
I think you might be one of the first groups
that actually go out and prove the reality
of our hypothesis.
So super stoked about that.
Yes, right. The transaction building in the current process is quite simple for us, but the route building and all those caching mechanisms are still being run in the back end, and bringing them to the client side will help us reduce some costs, I guess. So that is what we are still researching.
And the thing I'm most excited about with the dynamic script composer is that we can just add any functions into the script. So I can just add a function to transfer some USDC to the fee payer and have them pay the transaction fee, and that will enable us to do gasless transactions. If you take this to the perpetual platform, we only want the users to deposit USDC, so they won't need any APT. In that case, we will just use the user's USDC for the deposit and order-placing functions. We will be paying the transaction fee in APT, but we will collect those fees in USDC, so the user does not have to worry about having the native token and getting APT. They can just simply deposit USDC into the platform and start trading. So that is what I'm looking forward to with the Dynamic Script Composer.
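To make that flow concrete, here is a hand-written Move script of the general shape a dynamic script composer could assemble. The perps::vault module, the USDC metadata address, and the sponsor address are all made up, and the sponsor actually paying gas in APT is handled at the transaction (fee-payer) level rather than inside the script itself:

```move
script {
    use aptos_framework::fungible_asset::Metadata;
    use aptos_framework::object;
    use aptos_framework::primary_fungible_store;
    use perps::vault; // hypothetical perps platform module

    /// One atomic transaction: reimburse the gas sponsor in USDC, then
    /// deposit into the platform, so the user never needs to hold APT.
    fun deposit_with_sponsored_gas(
        user: &signer,
        usdc_metadata: address, // FA metadata object address for USDC
        sponsor: address,       // fee payer covering gas in APT
        fee_reimbursement: u64,
        deposit_amount: u64,
    ) {
        let usdc = object::address_to_object<Metadata>(usdc_metadata);
        // Pay the sponsor back in USDC for the APT gas it fronts.
        primary_fungible_store::transfer(user, usdc, sponsor, fee_reimbursement);
        // Then perform the actual platform deposit in the same transaction.
        vault::deposit(user, deposit_amount);
    }
}
```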
I think that's a very smart use case
that you have found over there, Anto,
to just, you know, bundle the transaction with the fee payer, allow USDC to be withdrawn, and then actually have it pay out the transaction in APT for
gas. I think that's a brilliant use case. And I also love the little bit of details that you
added on to David's prior point of how you're implementing account abstraction
while managing risk here. David, I actually want to take a moment and talk a little bit about the
efficient implementations of the new data structures that we have. And we have Moon over
here as well. So David, we want to just take this away with Moon and maybe just talk a little bit around
our new ordered maps that we have in place and how that can actually make this entire
process way more efficient, especially for perpetual trading.
I think you did a great job there, Sneha.
I think the only thing I'll throw in there is a little bit of background. So we first started out with tables. Tables allowed people to create arbitrarily large data sets on chain. I think Alex can give a long history about his exploration of data structures dating back to May 2022, during our early era, and we recognized that we could do much more complex data sets or structures, much like Alex had been doing.
So we brought in smart tables and smart vectors, which effectively allowed for putting a lot of data into a single field or value inside of the table, and that allowed for better use of storage and reduced overall fees.
But one of the issues is that Smart Table, despite the name, was actually really dumb, in that it could allow an application using it to be the victim of a denial-of-service attack. So imagine you're happily using your Smart Table, you're building this really cool DeFi application, it's got all these different positions, and somebody just starts inserting a bunch of data into it. And now all the assets that you've stored inside of your smart table are no longer accessible, because you're hitting an abort error, because somebody figured out how to cleverly attack that data structure, which is a feasible possibility. Fortunately, nobody ever did that. But we recognized the smart tables weren't that smart. We stopped promoting them as much, actually gave caution about how developers were using them, and we recognized we could still be more performant by not only updating the way in which we interacted with the data structures, but also bringing forward behind-the-scenes move operations.
And what I mean by this is storage-level move operations that bypass going through Move and go directly to the underlying VM to actually move data around. So if you come from, like, a C background or an assembly background, it's these underlying operations that allow you to efficiently move memory around really, really quickly. So when you're talking about a large vector, instead of grabbing a specific resource or struct and trying to move it around, copying and pasting all this stuff, you're literally just saying, I don't care what's there, just move over this chunk of memory from this location to that one. Much more efficient, very fast. And by going down that path, we've been able to build these really amazing data structures that are super fast.
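A tiny sketch of what that looks like from Move code. The aptos_std::mem module and the replace signature used here are my best recollection and should be treated as assumptions; the point is simply swapping a whole value in place instead of copying it out field by field:

```move
module example::swap_demo {
    use aptos_std::mem; // assumed module exposing replace/swap helpers

    struct Slot has key {
        payload: vector<u8>,
    }

    /// Overwrites the stored payload in place and returns the old one,
    /// without copying the resource around field by field.
    public fun replace_payload(addr: address, new_payload: vector<u8>): vector<u8>
    acquires Slot {
        let slot = borrow_global_mut<Slot>(addr);
        mem::replace(&mut slot.payload, new_payload)
    }
}
```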
Now, with that being said, I want to throw it over to both Moon and Alex to chime in on what this means for their applications and their development process.
Yeah, great overview, David. I guess, as a developer, some nice things about it are, like you said, there's some new syntax. Like, I come from the C++ world, where we're very used to having iterators to iterate over data
structures efficiently, with, like, efficient runtimes and nice syntax. And that's, I think,
kind of the nicest thing from the DevEx point of view that the big ordered map does: it makes it
very easy to iterate in order. And when you're thinking about designing things like an order book or
perpetuals, you often have to order things like on an order book,
you have to represent price time priority.
So the orders that get placed earlier get filled faster.
And that's just a perfect example of where you'd want a large ordered map.
So you can iterate from the first order you want to fill to the last order you
want to fill.
And so it's just a very useful primitive with like very nice syntax. And
what it means for projects is previously, if you had these large numbers of ordered things,
you'd have to either index it off chain and like maintain that ordered mapping in an off chain
data structure, or you'd have to limit the number of orders you can have
because you don't want a denial-of-service attack.
And so a big ordered map allows you to scale,
let's say price time priority on an order book,
to any number of orders.
So I'm curious what Alex is thinking on that side of things.
But it's just a really nice DevEx improvement for everybody.
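A small sketch of the price-time priority idea being discussed. The aptos_std::big_ordered_map import reflects the library in question, but the struct layout and key encoding here are hypothetical design choices for illustration, not anything a team here ships:

```move
module example::book {
    use aptos_std::big_ordered_map::BigOrderedMap;

    struct Order has store, drop {
        maker: address,
        size: u64,
    }

    /// Asks keyed so that in-order iteration walks best price first,
    /// and earlier orders before later ones at the same price.
    struct Book has key {
        asks: BigOrderedMap<u128, Order>,
        next_seq: u64,
    }

    /// Price in the high bits, arrival sequence in the low bits:
    /// sorting by this composite key yields price-time priority.
    public fun order_key(price: u64, seq: u64): u128 {
        ((price as u128) << 64) | (seq as u128)
    }
}
```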
Yeah, thanks for the tee-up there, Julian and David, for helping push a lot of this along in the background. We started out at Econia Labs with the gas schedule in 2022,
which has since evolved. And we were using the table, like David mentioned. We already had
started using this idea of saving storage slots
because it's always more expensive
to open up a new storage slot than to reuse an old one.
So we sort of did something
that predated the current storage slots allocator paradigm
by about a year and a half.
Technically we were pushing slots onto a stack
and then popping them off in the AVL queue,
which is what powers the Econia order book right now. But there are even more efficiency gains that are possible now that we can include even more
data in each storage slot and use generally concurrent inserts. So this is sort of what the
B+ tree that currently exists, the big ordered map, does. One of the tricky parts is that if you
keep inserting the highest value,
it ends up being serial
because there's like this indexing issue
where the right-hand side of the tree
always kind of has to overwrite.
So you actually have to kind of invert things
when you're doing like the best bid basically
and certain things that come into order books.
Another tricky thing, where the B+ tree that's in the Aptos libraries doesn't directly carry over, is if you want to have parallelized order IDs, for example, because there are multiple things at the same price level, but then they have a sequence number within a price level. So we've already been solving all these problems internally in some of our designs, and we'll be creating more custom data structures. But I'm just glad that there's
a lot of this lower level memory management that's now available that increases the number of tools in our tool belt
because i come from a c background um and also you know assembly language as well so lots of
low-level memory management and i'm just glad that we can finally you know have something like a
pointer even though i guess move is supposed to not really give us pointers um but yeah i guess
at this point i'm kind of digressing into technical jargon which maybe maybe you're here for that. But I mean, the main takeaway is that
a lot of the new things that are coming down the pipeline, just make it easier for us to build
things that are even more gas efficient than we have in the past. Like right now, placing a limit
order and, like, posting it to the order book on Econia already costs like two-hundredths of a cent.
So getting it down to less than that, maybe even one thousandth of a cent to the point where it's
essentially negligible, you know, especially if you're trading at size.
Your gas costs don't really matter when you compare them to, you know, how much you might be making when you're quoting the spread.
So excited that we can take a lot of this into high-performance DeFi and, I think, solve a lot of problems that you can't really solve until you just have generally, you know, very high throughput.
So it's exciting to think about all the different things that can be settling on here.
Not just crypto assets, but a lot of things that already trade at multiple hundred billion,
sometimes even trillion dollar volumes in the real world, like Forex or Treasury markets,
which are extremely liquid.
There's definitely a path forward for those kinds of things to be settling on-chain,
on Econia Labs-built infrastructure.
Awesome, awesome.
I do want to take a moment and switch gears a little bit here.
I actually want to give Everstake a chance to talk. I think there are a couple of areas that we talked briefly about beforehand: talk a bit more about using permission signers or account abstraction, or a mixture of the two. It'd be great to dig into what that means for you folks and your staking operations.
Yeah, sure. As a validator, we always think about staking and how we can improve the user experience.
And I'd like to add a few words about using permission signers that can help automate staking processes.
So in general, permission signers allow users to delegate specific permissions without exposing
full control over their accounts.
And when it comes to staking, it can be really helpful, useful.
And I'd like to discuss maybe a few use cases.
For example, redelegation.
So currently to redelegate, a user needs to first perform an unlock transaction and then
a withdrawal transaction after the end of the lockup period.
And only after completing those two steps, a user can stake with the new validator.
So if users don't withdraw immediately, meaning right after the end of the lockup period,
they stop earning staking rewards until the transaction is completed.
And with the permission signers feature, the staking process with the chosen validator can be automated.
And, for example, transactions can be done instantly once the lockup period ends,
making sure that there is no wasted time and users, for example, don't lose staking rewards. And of course, the rights to do such transactions can be shared with a signer and used in developed algorithms. And I can also imagine it enabling
automatic stake movements and distribution between selected validators to optimize rewards and decentralization, and this could be based on validator performance, allowing the system to react instantly in case a validator node isn't performing well. So I think these are some examples of how the permission signers feature can be used and implemented in terms of staking.
Awesome. Thanks, Anne, for that overview on how you will be implementing account abstraction and permission signers for staking account management.
So, David, I think we can just maybe go around and ask if any of our ecosystem teams have any questions around the Move 2 features. Given that you are here, I think it would be pretty good to have any questions answered.
Okay, I think, maybe, David... okay, David got rugged.
All right, okay, Alex, please go ahead.
Yeah, my question is about the prover
and if it's gonna be continuously maintained
to work with some of the things like type info,
for example, that cause issues
and where this plays generally into the Aptos tech stack,
the Move prover, moving forward.
I think David is still rugged. So I would love it if anyone else wants to chime in on Alex's question while we wait for David to be back.
I don't know how we are still getting rugged on Twitter Spaces. It's 2025, Elon is here, many things are happening. But it's my fault, I was DMing David and I rugged him, I'll take all the blame for it. He's coming back, David is going to be back very, very soon.
But, Moon, if you would like to just add to what Alex talked about?
You know, I'm also a big fan of the prover, but, like Alex, you know, we can all see how it could be better, more powerful, so you can prove more things. So yeah, I'm curious to hear his answer.
The prover, you know, I think, is one of the best things about Aptos Move. It's one of the most powerful things, a big differentiator, but there's just a lot of improvement to make it even better.
Okay.
David is still stuck.
Do we have Brian here?
He's a listener.
David's a listener; you could invite him up.
Yeah, I probably don't have the access to do that.
No.
All right.
So that is maybe one or two minutes of awkwardness over this chat. But maybe, Anto, would you like to chime in and talk a little bit about how your experience has been with the Move prover?
I have already started working on the Move prover, but I did not get completely through our contract state. I am still adding more tests, and currently the coverage is around 75% or so. I still have to cover the code base carefully, and after that I will be working on the proving stuff. So I still want to know more about the prover, and one thing I can say is that it is really hard to learn the Move Specification Language. I'm still a beginner at that, so I'd just like to hear from everyone else.
Yeah, I think a lot of projects are in the same spot where you get started and then it becomes a lot of work and you go back to unit testing.
So I think the goal should be that the prover is as easy to use as unit tests, or maybe even easier.
But I mean, we're not there yet. So we're in the same spot. We haven't proved everything either.
Moon, maybe a question to you would be, I mean, this might not be a very, it might be a very basic question, but I think this would be pretty helpful till David jumps back on.
So in terms of unit testing, how do you think, like, why should you be using the Move prover? Like, what are the benefits of using this, especially in terms of a DeFi project?
How safe can this make your project?
Yeah, that's actually a great question.
I think the best way to think about it is with the unit test,
you're typically testing one input and you're checking to make
sure it matches an expected output.
You could check, well, if I do five plus five,
the unit test would check to make sure the output is 10.
With a prover, instead of just proving one input, it proves like some sort of equation or some sort of statement for all inputs.
So you could say, you know, this equation, this function takes in A and B and it adds them.
And you could write a statement that's like every single time this function runs for any, you know, value A and any value B, the output will be the addition of those two.
And where the prover gets a little more complicated is in Aptos move, you know, if both numbers
are too large, it'll overflow.
And so in some cases, actually, that addition will fail.
And so in the prover, what makes it kind of complicated is you have to consider every
single failure case.
So you'll have to also write, well, if the result of both these numbers is too large,
then it'll revert.
And so it really makes you consider
every single possible input
and every single possible failure case.
And if the prover spits out your proof is correct,
it's well-specified,
then you can be assured that in 100% of cases, it'll always be true.
So if you check, let's say the signer has to be an admin and you write a statement that's checking,
that proves that every time this function runs, if you're not the admin, it'll fail.
That sort of security is super, super important because you can't write a unit test for every
possible signer. There are infinitely many.
So instead, the prover will actually do that for you.
And so if you could manage to prove out a file, it's just a kind of higher level
of security than you'd ever get through unit testing.
It's almost like a mathematical function, like x going to f(x), and it's covering the entire range. So you are, like, completely sure.
It's funny because in the backend,
the way a prover actually works is math.
It's an algebraic solver.
So it actually transforms your move code into algebra.
And then it uses like a mathematical standardized algebraic solver to
prove that the algebra always holds true.
So it's actually exactly what you said.
It is actually math behind the scenes.
The move prover is great if you know how to specify what properties that you want to prove.
But the tricky part is that you have to know what properties you want to prove.
And the language that it uses consists of, you know, invariants and axioms that are not as easy to digest as
unit tests are for most software engineers. But I think the main advantage of it is it's basically
just a different way of thinking about your code and a different way of testing it. And generally
more testing is going to result in more secure systems. And I think that as we've seen with
projects like Aave doing formal verification on their code bases, you know, sometimes after the
fact, it can help give that increased assurance.
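[Editor's sketch, not from the recording: a toy data invariant written in the Move specification language, to make the "invariants" mentioned here a little more concrete. The Vault struct and its fields are invented for the example; the prover checks the invariant wherever a Vault value is created or mutated.]

```move
module example::vault {
    /// Hypothetical bookkeeping struct for this sketch.
    struct Vault has key {
        total_shares: u64,
        balance: u64,
    }

    spec Vault {
        // Data invariant: outstanding shares may never exceed the
        // underlying balance. The prover enforces this at every point
        // where a Vault value is constructed or updated.
        invariant total_shares <= balance;
    }
}
```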
But in terms of the traditional
software engineering lifecycle,
it's still, I think, a toss-up on whether it is necessary
for teams, simply because of the additional lift,
but I definitely would like to see it get to a place
where it's not as much of a lift.
And I do have to go for something at 15 minutes past the hour.
So I just wanna thank Aptos Labs for hosting this
and everyone for having such thoughtful,
engaging discussion and I'll be around.
David, welcome back.
I don't know what the question here towards me was,
but at least I'd like to kind of throw a quick call out there.
So we actually have the entire
or the majority of the Aptos framework proven out.
As a result, I actually got to do a bit of exploration in that space, and I can say, at least from my experience,
the one area where the prover was particularly valuable was in exactly that kind of checking. I implemented
the Move code, I was really happy with it, and then I went along doing the proof. Along the way, as I was
running the actual proving tests, it turned out that my
proof was much more comprehensive than what I had actually implemented, and therefore there were
potentially bad pathways in the original Move code. Now, the
reality is those bad paths probably would never, if ever, have gotten executed, but the prover did
help me, as a result, write much more robust application code.
And so I do think, when you think about a DeFi application, it can be particularly useful if you want to make sure, as Alex alluded to,
that you really get all possible pathways to be really, really rigid and strict, to make sure you don't have these kinds of safety violations.
The other big area where we thought it was really prudent
was in the case of how we do the underlying governance
and the transition between different blocks
and epochs inside the blockchain.
So that's effectively at the end of each series
of our block of transactions,
we're gonna switch to a new state.
And after two hours, we switch to a bigger state.
And so that's the difference between
a block and an epoch. When we do these transitions, it's almost all Move code that's being
executed, and we want to make sure that there is no way those could abort in a way
that's not predictable or believed to be possible, because if it does abort, then effectively our
blockchain is going to be stalled. And so by employing the prover, we're actually able to verify that there
is no way we'll end up in a bad state, based upon the logic that's implemented
inside of the Move code. So the prover has actually been particularly useful in that regard,
but as Alex stated, it is still a big area of exploration and research.
So it is a bit of a challenge: if you're building out an application, it can add a
lot of lead time.
And so you really want to use it with the right frame of mind about what type of application
you're building, what your risk profile is, and whether or not that impacts your time
to market in a substantial way.
So it's not like it's a silver bullet where you say you must absolutely use the Prover.
I think it's really about using it for the right applications at the right time.
I think Circle, USDC also went down that path of exploring and using it for their applications.
And I think, like you alluded to, the most important code to prove
is the code everybody uses, you know, a stablecoin, the Aptos framework, our financial products as well. But,
you know, the fact Aptos is proving it out, that's the most important piece because everybody
relies on Aptos framework. Everyone needs the next block to be produced.
Absolutely, absolutely. But with that, I think we're 15 minutes over, and I think we want to
open up about 15 minutes of Q&A. I don't have host access to the Space, so I'm going to let the Aptos handle
bring up anybody that wants to go ahead and ask any of our panelists, or us, any questions
about whatever is on your mind, but particularly on the topic of Move, Move 2,
and this Move madness that we're in. Before anybody comes up,
can I ask something, David?
Absolutely.
Well, maybe it's a weird question.
Is there any chance that we can use Cursor IDE?
So that's a great question. Internally, I'd say 30 to 50 of our engineers are actively
either exploring or using Cursor. I think a handful of folks are trying to figure out how they can
get Cursor to know a bit more about Move.
And then, for those that don't know, Cursor is an IDE.
It's an AI-based IDE, and it's based upon VS Code.
And so the hope is that with a bit more effort on the VS Code plugin, it would actually
be particularly useful.
The Move VS Code plugin would be useful within Cursor.
So that's an active area.
We've got a gentleman, Maxim, who's actively working on making that better. And so, yeah, the hope is that
in the near future you'll be able to generate with AI. I guess I wouldn't go to the extreme of
saying vibe coding Move, but we'll definitely make AI and Move play much, much better with each other. David, I also want to use this opportunity
to make like a small plug.
I'm also working on MCP with Claude
and I've been feeding it a lot of move smart contracts
and trying out the outputs.
I will maybe share it later,
but it's been actually giving out pretty good outputs
in terms of our code. The only thing that
we will generally have to look out for is that it may be using, like, version one of
our TS SDK or something else, and then the outputs end up using
v1 or something. So yeah, we'll probably have to work on that bit.
But yeah, the MCP with Claude
can actually give you very, very good smart contracts.
Along with that, any sort of front end as well
using the Aptos Wallet Adapter, Aptos TSSDK,
I think it's giving pretty good outputs.
So you can see Sneha is a real developer,
and I just posed as one.
I would love to have any other questions.
I think Shamrock might be having another question here.
Yo, what up everyone?
I hope you can hear me, because Spaces have been weird for the past few weeks.
Yes, we can.
Go ahead, Shamrock.
Yeah, let's go.
Okay, so legends on the stage, legends in the audience.
And, David, I missed you so much.
Shamrock, we see each other every day in Supervillains, but I've missed your voice.
That's different, bro.
That's different. I can't hear your voice there, you know, that's the difference, and your passion. And yeah,
actually, people who know me on Aptos know that I'm totally not a technical guy, and when
I'm listening to all those technical Spaces, you all sound like wizards, like magicians to me,
but I will try to drop my question on topic.
So basically you have move language
and it is still not fully investigated.
I mean, you still find something new almost every day,
I believe, because when I listen to these Spaces, I can hear here and there that people are finding some new functions. So actually, what do you feel when you find something new in the language, a language which, you know, exists, but whose whole potential you don't yet have?
It's a question to everyone, even to you,
David, or whoever wants to jump in. I'm curious.
I kind of want one of our speakers to jump in and share their thoughts and
frame of view first; otherwise, I'll chime in towards the end.
And awkward silence.
I am sorry.
I actually missed maybe one minute or so from the spaces.
Do you mind just talking about the question again?
But I just have to say, Shamrock, your energy is off the charts.
Well, like, basically, my question is what you feel,
like all the builders on Move,
like what you feel when you're building on Move language
and you know that you don't really know the full potential of
it, and you find new features like almost every day?
Oh yeah. Let me just reflect on this: when we were building out the original NFTs, we were like, this is the de facto way that we
need to represent NFTs in Move.
And it doesn't make sense to do it any other way.
The problem was accessibility.
We actually had all the NFT projects come to Aptos
in January 2023 for NFT events.
And the folks from Bruh Bears said that they lost
one of their Bruh Bears on chain.
And it was like, wait, that doesn't make sense.
Like, move doesn't allow for that to happen.
And it just turned out because of the way the NFT standard was at that point in time,
it was really easy to put your NFT into a random place and never be able to figure out where that was.
It's the same truth for the coin standard.
And so, like, that's a really powerful primitive: to be able to do that kind of wild stuff and lose your data, but
have it still not really be lost. And so I think that's kind of
the beauty of the Move language, which is that you can kind of
construct all these wild things, where you have to
say, well, what is the problem that I'm trying to solve and
move towards? And, you know, when you're building these
underlying primitives,
how do you actually think about what somebody could do that might be wild and unpredictable?
Is it something you try to control, is it something you try to fully understand, or
do you just kind of go on a whim and say, I hope this doesn't blow up on us? And I'd say
we've kind of blended all three of those together at every step of the way.
You know, when we built out objects, we made objects look a lot more like accounts.
In retrospect, we probably should have been a lot more restrictive because now we're already talking about storage-free accounts.
We're getting rid of GUIDs.
There's a whole bunch of other stuff that's happening behind the scenes that say, well, you know, if we had thought about this a little bit more deeply, maybe we'd have something different. But at the same time, by moving quickly, we've created
opportunities for more iterations, more evolution, and more creativity in the space. And so I don't
really know how that really answers your question other than to say, I think it opens up a lot of
doors to folks just kind of building in a way that maps to what makes sense to them, but
perhaps at the risk of Move not being extremely opinionated and saying, this is the only way
to do something.
Okay, so to follow up on this, do you believe there is actually any ending
point, when Move is fully, like, investigated and learned? Or can we expect,
you know, after Move 2, a Move 3, Move 23, Move 100 or something?
I think it goes in two directions.
I think one is what is the future and evolution of the language?
And I mean, while we still haven't gotten the AI story fully concretized, and we're still seeing what the gaps are and what we can make easier to do, there are still substantial things, in terms of, like, traits and dynamic dispatch, that are still on the forefront. But I think it goes back to this: even to date, there is not a book that says
canonical Move. Or, you know, if you're a C++ person, you might remember Effective C++,
but even the books that were written back, you know, 10, 20 years ago, which I have on my shelf right next
to me, have largely been superseded by whatever version of C++ is currently running, C++20 or whatever it is.
I don't know.
And so like languages continuously evolve.
The right way to write the language also evolves.
And I would say at this point in time, nobody can say this is the canonical way to develop
a move application.
And so every time I see a new Move application, it is just so different and explores ideas
that help me, as a developer, actually say: here are new ways in which I can educate other people
to be great developers.
Shamrock, my friend.
Okay, go ahead.
I just want to say, just since we are in the final minutes of wrapping up this session,
because we have all of these top-tier DeFi teams here, we're all building on Move,
and I truly, truly believe that what I'm about to say is going to be, like, a very bold
prediction, but I truly believe that Move is a smart contract language that is totally, totally different from what we
have currently in terms of Solidity, Vyper, or, like, generic programming languages.
And the capabilities that Move carries right now are insane.
And it can actually transform the way that we are dealing with blockchains, because it understands
blockchains at a core level, it understands assets at a core level. And that's exactly the reason why
more and more blockchains will start adopting Move as a smart contract language. We have already seen
this with a couple of projects like Initia and Movement adopting Move. And we will also go on to see bigger current projects like Solana and a few other
mainstream projects adopting Move as a language, simply because of the
way it has been designed.
And as David said, it will continue to evolve throughout.
So yeah, like, I think that is pretty much what we are all here for. David, any
ending notes from your end, or any news? That would be good.
Giving a small pause to let anybody else jump in... I just gotta say, it's great to be with
all these amazing developers.
With each of you, I've spent time learning
about what ails you and working with you to make things better.
I think we're definitely in a greater space
than we were when we started,
when Alex, who's had to jump off,
started... roughly, gosh, it's been three years
to get to where we are now.
And I'm really glad that you guys have been part of that journey.
And I really look forward to as we evolve Aptos, as we evolve Move,
that you folks stick around, that you continue to give us feedback about what the challenges are.
Because ultimately, what we're trying to target here is we want retail, we want users to have the best possible experience on Aptos.
And what that means is when
we build out Aptos, when we build out Move, it's with that in mind. And the way we do that is by
taking your feedback and taking the feedback that you get from your customers and your partners
and using that so that we're all working effectively together. And I think that's what's
made Aptos really great. And that's what I tell everybody when they talk to me. What makes Aptos, what's a differentiator for Aptos?
It's that we've got this amazing community, whether it's Shamrock, who's coming in from the community, from the retail space, the consumer space, and sharing what excites him about Aptos and that energy.
Or it's folks like Mirage, Anto, Alex, Patrick, all sharing the great stuff that they're doing
to make Shamrock's life on Aptos amazing.
And so I just really want to say thank you to all of you
for coming here, for sharing your insights,
for being part of this journey.
And I wish everybody a wonderful and fantastic day.
Let's move.
Let's move. Let's move.
Before I leave the Space, I'm just going to throw out a quick plug for Mirage.
Keep an eye out for notifications.
We're going to be live in a few weeks, and you're going to want to use us.
So stay tuned.
Thanks for inviting me, guys.
Take care. Thanks, everyone, for joining in.
Yes, thank you
for inviting me, it was a pleasure. Thank you.
Thanks for having me, and stay
tuned. Like I said, our testnet will be up in
a little under two weeks.
Superb, looking forward to it.
GN, GN, folks.
Thanks for having me, guys.