Oh hey, is my audio okay? Can you hear me? We did, we got you. Sweet.
Welcome, welcome, one and all, to the Anoma AMA.
Thank you, thank you for joining.
That's quite a lot of microphone feedback there. Do I sound okay, or how am I sounding?
Excellent, excellent. How are we doing? Doing good?
Doing great. Happy Friday. Yeah.
A lot of microphone feedback
Yeah, it's getting quite a bit of feedback, but I don't know if that's just me. Oh, you're getting a lot of feedback.
There's a lot of background noise.
Oh, Mike, how are you doing?
Sounds beautiful. Nice, nice. I'm in.
Yeah, they still didn't fix the distributed systems problem at Twitter.
It's ridiculous. Like, to me, Mike still shows as a listener. It's interesting. Oh yeah. Look, whenever I host these Spaces, whenever I do Spaces with just friends, it's so bad. Unless you're in from the start, people that join on stage just show as listeners, and you have to guess who's speaking half the time. But they fired like 80% of their engineering staff, so they're doing the best they can.
What can you expect?
Modest success, actually. Honestly, for Spaces here, half the time half the people can't hear. Like, this is a success. We have four live speakers.
Ah, the background noise has gone down.
So we should kick it off. You can go.
Sorry, you're not going to hear any background noise from me then? No.
I sound crisp, crisp and clear.
Oh, I keep muting myself.
Should we do a quick round of intros?
I think Adrian should go first.
I like software and stuff. I like intents. I like Anoma.
I do DevRel at Anoma. So, yeah, just happy to be here. It's my first Space here.
Go on, who's next? Mike, you go.
Yeah, Mike. I do BD and growth at Anoma. Done a lot of stuff in the crypto space: built infra, built apps. Excited about building apps on Anoma, excited about DeFi and on-chain activity, and excited about jamming with builders like Repo who are seeing the light on Anoma.
Awesome. I can go as well. I'm RG. I'm a core contributor at Repo, and we've been building on Anoma for, I guess, four to six months now. And I'm super excited to share some of the progress we've made and some of the exciting things that are in front of us.
Looks like we've got Harsh on too.
Yeah, I can go next. Hey everyone, I'm Harsh. I work with Repo as a core contributor too. Love everything sovereign, love everything Web3; helping build cool AI data applications using Anoma as well.
Would it be good to go into what you're doing? Maybe that would be a good starting point, if you guys want to just talk about that. Introduce your project, maybe an elevator pitch, something like that.
Yeah, totally, we can do that. So at the core of it, what Repo is building is a resource coordination layer for AI. And what we mean by that is: right now, we see a lot of AI launchpads, agent launchpads, and places to build. We're more focused on coordinating access to resources, specifically data, infra, and capital. And that's specifically where Anoma comes in: we want builders, and even AI agents and physical AI systems, to be able to discover, negotiate, commit, and settle for specialized data and specialized infra on demand. And we'll dive into how we've designed a novel solver node design based on the Anoma resource machine and how we're enabling it. So I'll let Harsh talk a bit more about the solver nodes, because he's been the main architect behind them.
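To make that discover, negotiate, commit, settle lifecycle concrete, here's a minimal sketch in Python; every name in it is a hypothetical illustration, not Repo's actual API.

```python
# Hypothetical sketch of the request lifecycle described above; the names
# are illustrative and do not come from Repo's codebase.
from dataclasses import dataclass, field
from enum import Enum, auto

class Phase(Enum):
    DISCOVER = auto()   # find solvers / data nodes able to serve the intent
    NEGOTIATE = auto()  # agree on price, format, and delivery constraints
    COMMIT = auto()     # lock the agreement so both sides are bound
    SETTLE = auto()     # resource delivered, payment released

@dataclass
class ResourceIntent:
    requester: str                      # a human, an AI agent, a physical AI system
    resource: str                       # e.g. "specialized data", "infra", "capital"
    constraints: dict = field(default_factory=dict)
    phase: Phase = Phase.DISCOVER

intent = ResourceIntent("defi-agent-42", "specialized data",
                        {"region": "US", "year": 2024})
```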
I guess that was a segue for me to speak. Yeah. As RG was mentioning, what we're trying to do with Repo is resource matching, in the sense that the first target we have is data requests.
So we want to create a system where any entity, human or AI, can, in any live workflow, create a request, or rather an intent, for data. Say I'm an AI agent doing some DeFi trade; I want to make a certain analysis based on this trade, and I need this data. This might not be public data. It could be: how many shoes of this brand got disposed of in Vancouver in 2024? That might not readily be available on a public data hub; it might not necessarily be aggregated; there's a bunch of data transformations that need to be done. That's where we really want to leverage the ethos of Web3: to provide sovereign data, and to use decentralization to aggregate this data.
Then comes what RG was mentioning: the solver nodes. They act as this aggregator layer of decentralized entities, right? So we broadcast these intents and constraints to the network, and we have our solvers that go out hunting for this data, meeting these requirements and constraints, and committing it back to the request. And then we have very happy users once their data is fulfilled, ideally.
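To ground that broadcast-and-solve loop, here's a hedged sketch; the function names and data shapes are my assumptions, not Repo's real interfaces.

```python
# Illustrative only: solvers pick up broadcast intents, hunt across their
# data sources for rows meeting the constraints, and commit a candidate
# solution back against the request.
def satisfies(row: dict, constraints: dict) -> bool:
    return all(row.get(k) == v for k, v in constraints.items())

def solve(intent: dict, data_sources: list):
    for source in data_sources:
        matches = [row for row in source if satisfies(row, intent["constraints"])]
        if matches:
            return matches  # in a real system: committed back on-chain
    return None

# Toy run: one "data node" with two rows, one matching the intent.
node = [{"city": "Vancouver", "year": 2024, "units": 1800},
        {"city": "Toronto", "year": 2023, "units": 950}]
intent = {"constraints": {"city": "Vancouver", "year": 2024}}
print(solve(intent, [node]))
```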
So, okay, when you sat down and did this project, what was the main problem you wanted to solve?
So it evolved with time. I think initially we wanted to focus first on getting more agents onto Web3 to fulfill these kinds of requests. It's the classic demand-supply thing that any resource matching company or project will run into, right?
But as we went along, what we realized was the beauty of the leverage Anoma gave us: this intent-based architecture. We could have these intents and constraints that we broadcast to the world, saying this is what we're looking for, and let the world of agentic AI, as it was evolving, process a lot of that intent in natural language. I think it was perfectly timed with the AI and LLM boom, too.
The intent is just how I described it to you, right? Like: what's the efficacy of a prescription drug on golden retrievers, initiated by vets in the US? A perfectly sane constraint to ask in a statement that can be processed from natural language into data schemas, which can then be validated and confirmed to meet certain requirements and constraints. So I think it was perfectly timed: us trying to create a resource matching Web3 engine, AI getting to a point where it could process natural language into deterministic schemas, and the intent-based architecture of constraints and intents acting as this matchmaker.
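That natural-language-to-schema step might look like the following sketch. The LLM call is stubbed out, and the field names are assumptions for illustration; the point is that the fuzzy NL parsing happens once, and everything downstream is a deterministic check.

```python
# Hedged sketch of "natural language -> deterministic schema -> validation".
import json

EXAMPLE_REQUEST = ("Efficacy of a prescription drug on golden retrievers, "
                   "initiated by vets in the US")

REQUIRED_FIELDS = {"subject", "population", "region", "initiator"}

def to_schema(natural_language: str) -> dict:
    # In practice an LLM would produce this; hard-coded here so it runs offline.
    return {"subject": "prescription_drug_efficacy",
            "population": "golden_retrievers",
            "region": "US",
            "initiator": "veterinarians"}

def validate(schema: dict) -> bool:
    # Deterministic: the fuzzy NL step is over, this part is exact.
    return REQUIRED_FIELDS.issubset(schema)

schema = to_schema(EXAMPLE_REQUEST)
assert validate(schema)
print(json.dumps(schema, indent=2))
```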
Okay, got you. So Anoma allowed you to bring what you envisioned to crypto and Web3, whereas existing chains probably didn't quite fulfill that. Is that right?
Are you saying that would be intents?
Yeah, I don't think there was an existing Web3 project quite based on the solver node architecture that we have. What we found is that data DAOs, for example, are the closest thing. Data DAOs have existed for a long time, but they weren't able to capitalize on the demand, usability, and consumability side of things.
So if I were to map out the flow of a user: the user goes to the platform; from the platform, the requests, intents, and constraints are broadcast to the blockchain network; a decentralized fleet of solver nodes triages these requests; and then they themselves configure what we call data nodes. And a data node could be a data DAO, it could be some decentralized data solution, your AWS S3 bucket for all we care, or it could be private enterprise data. So that's the configuration solvers can set up to stand out and be a more active entity in the network, based on the edge they have by becoming these data providers.
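One way to picture those heterogeneous data nodes is a single interface with many backends, as in this sketch; class and method names are assumptions, not Repo's code.

```python
# A data DAO, an S3 bucket, and private enterprise data all look the same
# to the solver: something it can query against constraints.
from abc import ABC, abstractmethod

class DataNode(ABC):
    @abstractmethod
    def query(self, constraints: dict) -> list: ...

class S3Bucket(DataNode):
    def __init__(self, bucket: str):
        self.bucket = bucket
    def query(self, constraints: dict) -> list:
        return []  # real code would list/filter objects, e.g. via boto3

class DataDAO(DataNode):
    def __init__(self, dao_address: str):
        self.dao_address = dao_address
    def query(self, constraints: dict) -> list:
        return []  # real code would go through the DAO's access flow

# A solver's configuration is then just its list of backends:
my_sources = [S3Bucket("my-iot-dumps"), DataDAO("0xabc...")]
```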
Now, there might exist these data DAO kinds of solutions, these data L1s here and there, but the big challenges were, one, discoverability, and two, consumption. So sure, you've created a data DAO of medicine being administered somewhere. Now, if I'm an AI agent mid-flight trying to make a trade on some medical company, how do I know that data exists in your data DAO? Now I want to purchase it; how do I consume it? You have a governance DAO that I have to negotiate with now. It becomes less and less permissionless as you keep going down that rabbit hole, which completely negates the ethos of Web3. We want it to be as self-sovereign, as permissionless, as agentic, to use the new terminology, as free and accessible to consume and discover as possible. That's the mission we were trying to solve, which other existing chains perhaps weren't achieving. I'll let RG chime in here too.
I think another thing I want to add here, in addition to the philosophical aspects of decentralization and permissionlessness, and Adrian talks about this a lot, is what's happening with a lot of ecosystems that are trying to build data layers or trying to solve the data problem. Just because of incentive structures and VCs, everyone has a very strong incentive to build an L1. So even when you look at data DAOs, or other data-for-AI L1s, everyone has a strong incentive to build or capture the value within their own ecosystem, right? So that means, let's say you've created a data DAO: now you want to make sure the data exchange stays within your own ecosystem. Everything is intra-ecosystem instead of cross-ecosystem. And the reality of the world is that the consumer, and in this case even AI agents, doesn't care about the manufacturer; they don't care where the supply comes from, right? They want something for their workflow and they just want to get through it. So if I'm building an AI agent and I want specialized data for my financial workflows, let's call it alternative data, some weather data, I don't want to spend the time registering on a new kind of chain and learning their flow. And I think that's the beauty: we extract that learning curve to the solver node, and by doing that, we're also abstracting away the details, because each solver node can optimize in its own way.
Yesterday, and I'll let Harsh talk about this as well, we saw a lot of vulnerabilities and security concerns coming up around MCPs. Now, if you're a data requester and you want some data, you can just say: hey, I want to work with solvers that have a really high reputation when it comes to security, because I'm working with sensitive data. Maybe you're working with health data. Versus, if you just want some synthetic data set, maybe security isn't the biggest concern; maybe quality or time is a bigger concern. So I think that's the main thing: by abstracting this job to be done, and the responsibility, to the solver node network, you can basically 10x or even 20x the user experience while creating really easy plug-and-play monetization mechanisms, both for data DAOs and data layers, and also for individual data owners and data consumers. So that's how we're thinking about this.
Oh yeah, thank you. That's very interesting. So,
what's happening with Repo now? I think I remember seeing something about a node sale, a solver sale. Can you guys talk me through that?
So one of the biggest challenges to putting this into implementation is that we need to bootstrap our own solver network, right? And we've gotten great support from the Anoma team here to figure out ways in which we can do that. We've decided to explore the node sale approach, where we think a node sale allows us to get people on board who have shown some level of commitment. And one of the big things about Repo and solver nodes in general is that it's less copy-paste, in the sense that it's not like you sell 10,000 or 15,000 nodes and all the nodes are doing the same thing. You can have nodes which solve more complex intents, and there can be nodes that are just a simple script that calls, say, an OpenAI GPT or o1 API and just pulls synthetic data sets. So I think that's really important for us. And we actually had a milestone of bootstrapping 400 nodes before we could launch the testnet, and we're happy to share that we met that earlier this week.
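For illustration, the "simple script" end of that node spectrum might look like this minimal sketch; it's hypothetical, not Repo's actual node code, and the LLM call is stubbed so it runs offline.

```python
# Hypothetical minimal solver node: poll for synthetic-data intents and
# fulfill them by prompting an LLM.
def fetch_pending_intents() -> list:
    # Placeholder for reading broadcast intents off the network.
    return [{"id": 1, "kind": "synthetic", "prompt": "50 sample vet records"}]

def generate_synthetic(prompt: str) -> list:
    # A real node would call an LLM API here (e.g. a chat completions
    # endpoint); stubbed for the sketch.
    return [{"row": i, "source": prompt} for i in range(3)]

def commit_solution(intent_id: int, rows: list) -> None:
    print(f"committing {len(rows)} rows for intent {intent_id}")

for intent in fetch_pending_intents():
    if intent["kind"] == "synthetic":
        commit_solution(intent["id"], generate_synthetic(intent["prompt"]))
```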
So if you go to reposolvers.xyz, you can learn about solver nodes, and if you want to participate in one, we'd be happy to have you. And as part of this current AMA, in the replies we just shared a very small NFT collection, so we're happy to reward everyone who shows support. But yeah, I think our approach has very much been that you can participate in the solver nodes.
We partnered with a project called Zoo Finance, which enables day-one liquidity. So the goal is that we're not trying to play any long-term games in terms of keeping the community waiting. As soon as the solver nodes and the data exchange are live, you should be able to earn fees, you should be able to earn rewards. And, these are not promises, but the different data layers we partner with might have rewards and things like that. So we're very much thinking community-first, trying to embed decentralization from day one, because from what we've seen in the crypto space, decentralizing down the line, although it sounds good, is a lot harder.
So, what can people look forward to with Repo? You talked about the node sale and what's going on now; what are you guys looking at in the future?
Yeah, so the biggest thing, I think, is our upcoming testnet on June 21st, where we'll start with, as I said, the initial set of nodes on repo.exchange.
And so what we're doing is we're going to market with, at this stage, about six data layers and data providers. Some of them include Vana and cambrian.org. We've actually been in touch with a Web2 company called Ragent Corporation, which has a bunch of IoT data that they're looking to monetize. And then we're in discussion with OpenLedger; some of you might know that project. And then just two other smaller players who are kind of in the same space.
And so that's what this will look like: on June 21st, if you're a consumer and you want some data, you can come and just put in an RFD and let the solvers do their job. Most likely we'll start with synthetic data sets and then eventually move to more custom requests. And then the goal is to do that for about eight to ten weeks, get the feedback, and then launch mainnet, where, depending on how things go, we could see a lot of volume as data becomes a bigger asset. So yeah, those are the next things.
Cool, thank you very much. We can go to some of the community questions that we have for the Repo team, if that's okay with you guys. Is that all good?
Yeah, totally. I'd love that.
Cool. So this one's from, I'm going to try and pronounce these names, because it's going to be fun, I think. So, from Boriska: in an environment where value is formed based on intentions and off-chain signals, how does Repo filter or validate data to avoid trust inflation? Are there any mechanisms for reputational consensus?
I love this question. Harsh, do you want to take a stab, and then I can jump in?
Just to rephrase the question, I'm guessing you're asking about data quality, right? And authenticity, those kinds of questions. We get those questions a lot. In our architecture we have two layers, two defense-in-depth measures. So the first is we're going to introduce validator nodes. The way validators work is: anytime someone fulfills a request, it randomly gets sampled and broadcast to this set of validator nodes, and they basically notarize the authenticity, if you may, or the quality of this data, and commit back to the network. These validators will accrue reputation based on how accurate they are, and they will be incentivized. This is where the crypto primitives and crypto incentives come into place.
The last line of defense is the user itself, who has the ability to finally accept or reject a solution. They get a preview, again of a random set of data points, not the entire solution, of course, and they can make the choice whether that's the right solution they want to use and employ in their workflows or not. If not, they can reject it, and if they accept, that's when the rewards get dispersed.
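A hedged sketch of those two defense layers: random sampling notarized by validators, then a final user accept/reject on a preview. The 0.8 threshold, sample sizes, and interfaces are illustrative assumptions, not Repo's parameters.

```python
import random

class Validator:
    def notarize(self, sample: list) -> bool:
        return True  # a real validator would check authenticity/quality

class User:
    def accepts(self, preview: list) -> bool:
        return True  # a real user inspects the previewed data points

def validator_score(rows: list, validators: list, k: int = 10) -> float:
    sample = random.sample(rows, min(k, len(rows)))
    votes = [v.notarize(sample) for v in validators]
    return sum(votes) / len(votes)                # fraction approving

def settle(rows: list, validators: list, user: User) -> bool:
    if validator_score(rows, validators) < 0.8:   # illustrative threshold
        return False
    preview = random.sample(rows, min(5, len(rows)))
    if not user.accepts(preview):                 # the last line of defense
        return False
    print("rewards dispersed to solver and validators")
    return True

print(settle([{"row": i} for i in range(20)], [Validator()] * 4, User()))
```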
Yeah, and in addition to that, one of the things we're also exploring concerns the different data nodes, or data providers, that will join the data exchange. Right now we're vetting them through the Repo Foundation, so we're more selective. But down the line, the way we're thinking about this is by limiting real estate, so to say, maybe having 16 to 32 spots for different data providers to come and compete, and that would require them to stake the Repo token, right? And essentially, by taking that kind of stake, the data provider is providing some sort of guarantee about data quality. So I think that's another approach.
And so really it's a three-pronged approach. It's on-chain reputation; it's the validator network, which is doing solver solution sampling, if you want to call it that; and then you have the staking from the data provider side. And that will basically give any data consumer an optimistic rating, a percentage saying: hey, this solver node is 87% likely to serve your intent successfully. I think that's how we think about it. From my experience, I used to work at Filecoin, and the challenge with data is that, in a decentralized manner, you can never guarantee data quality. I mean, the only way you could do it is maybe with some insurance mechanisms, where you can reward someone if things didn't work out well. But I think the optimistic approach is the best approach, where, for the time being, the consumer, which again can be AI agents and physical AI, has to take some level of risk. It's always risk-reward. So yeah, that's how I think about it.
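The "optimistic rating" could be as simple as the sketch below: estimate, from on-chain history, how likely a solver is to serve an intent successfully. The smoothing choice is mine for illustration, not Repo's.

```python
def success_estimate(accepted: int, total: int) -> float:
    # Laplace smoothing so brand-new solvers aren't scored 0% or 100%.
    return (accepted + 1) / (total + 2)

print(f"{success_estimate(87, 100):.0%} likely to serve your intent")  # ~86%
```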
Yeah, it's good to think through the authenticity side, and it's good to see you guys have put a lot of thought into it. Okay, one last question for you guys.
This is from Will Moraes, I think I said that right. It's kind of a general one: what is Repo's long-term vision within the Anoma ecosystem?
Right now the main thing we're building is the data exchange. But if you go to repo.xyz, you can find our white paper and how the Repo data exchange and the infra exchange, which you can find at, I believe it's just called infra.exchange, actually, fit together. Anoma is a core, core aspect of how this closed-loop economy works. So essentially, the world we're imagining in the medium to long term: right now everyone's competing to be the L1 for data, or the L1 for models, or the L1 for decentralized compute, whatever that is, right? We think that's cute, but in the medium to long term, the reality is that if this industry is going to compete with Web2, we have to think user-centric. And that means the user should not care about which layer or which protocol their data or their compute, a.k.a. the resources, are coming from. And to that extent, Anoma becomes a critical, critical aspect of the Repo design and the Repo exchanges.
And I think one of the things that we haven't necessarily talked about a lot: we don't think these two exchanges are going to be the end products. I think these are going to be go-to-market products, and down the line what we want is for people to fork them and create their own exchanges. Like, if you're in a certain region, or let's say you want to create a data exchange only for health data, you should do that, right? And what we want to do is give you this out-of-the-box playbook: okay, if you were to create your own data exchange for health data, well, you have to think about solver nodes, you have to think about data quality, you have to think about all these things. And so I think that's the medium to long-term goal. The goal is not to lock people into our exchanges; the goal is to build and show how this decentralized middleware solution can actually improve the UX 10x, 100x, and then allow anyone to build smaller, more specialized exchanges and marketplaces. We really, really believe that the future is marketplaces for specialized access to data and infra. So that's the medium to long-term vision.
Amazing, thank you very much. Yeah, thank you very much for coming onto this Spaces and speaking. I think we're going to move over now to some more Anoma team questions, if that's okay with you guys. I'm going to throw out these initial ones. So this is from Asif:
How does Anoma position itself among L1s like Ethereum or Cosmos?
Who wants to take that one?
Adrian, do you want it, or do you want me to take it?
So, how does Anoma position itself among other L1s and L2s, let's generally call them base layers? When we think about it like that, I want to drive this point home pretty hard: Anoma is not another base layer. I think people want to put things in buckets in crypto, and they want to say, oh, it's a base layer or an application. To be clear, we're differentiated, I think, from base layers. Really, the idea or concept of Anoma was born to make base layer technology better for users, i.e., to deliver a better application environment, to allow base layers like Ethereum, like Cosmos chains, like all the new ones that are coming now or in the future, to do what they are good at, which is to provide a settlement environment and express trust assumptions. Anoma is really meant to sit on top of those as an operating system, a functional layer for application development. So when we think about all these various L1s that exist, I like to think of Anoma as a value-add: they're good, this technology has taken us to a certain point, and Anoma is really the unlock that can make all these various base layers speak to each other, that can harmonize liquidity, harmonize the user experience, and ultimately give us an experience that, I hate this analogy, but is more Web2 in nature: user-centric, usable apps that scale and can do a whole variety of more expressive things using Anoma's tech. So that's how I see it. I don't know if anybody wants to add.
I agree, I completely agree. And if you haven't already, it only just dropped before this Space, but the first Anoma 101 just went out. And it's pretty good for a quick summary of what Anoma is, so I'd recommend everyone check that out if they haven't already. Just doing a little plug.
Yeah, I actually think that's a great explainer, so I would steer people to that. I think the TL;DR there is that we've gotten into this vicious cycle of what I call the chain wars, where one cycle it's Ethereum, the next it's Solana, then it's Monad, then it's Wonad, then it's Gonad, whatever. And with all these different chains that come, app developers have to take, not a one-way bet, but quite a static bet on where they want to deploy and which chain is going to be popular. And they really hang their hat on that. And I think the beauty of Anoma is it gives you the flexibility to deploy across chains, but also to become future-proof. So essentially, as new chains deploy, you can just have connectivity to those as Anoma is deployed there alongside your app. So your app has a lot more lasting power than it has had in the past.
Gonad is a great name for a chain, though.
I shouldn't have said that. I was riffing through letters, and I was like, I'm going to say G, aren't I?
Right, we'll go to another question. This is from Tiffany Blue: what are some real-world problems that only Anoma can solve but other chains can't? And which current L1 or L2 would benefit most from integrating with Anoma? That's actually more of a two-parter. So yeah: what are the problems that only Anoma can solve, and which current L1 or L2 would benefit most from integrating with Anoma?
Yeah, I can quickly take this. I mean, Anoma is the only place that has actually spent any sort of time on making the distributed operating system work in practice, i.e., what people think of when they hear the idea of a world computer. Anoma is actually a thing that does it, right? Because it allows you to write an application once and then not really care on which specific chain it runs. Let's take Aave, for example: it's kind of weird that there are like ten different deployments of Aave, and a new ecosystem then has to go and see if the Aave team wants to also deploy to them. There's not one Aave application; it's just ten different Aave applications, each prefixed by a specific chain. With Anoma, this just becomes: there's one Aave application. It just happens that maybe there are multiple chains under the hood, maybe there are multiple different security models under the hood, and this is all completely configurable on the user and the developer side.
So a huge part where this really plays in is especially with DEXs, because you get to share liquidity: your liquidity isn't isolated to a specific chain, not only isolated to Ethereum; rather, you have shared liquidity, which is a huge, huge capital efficiency gain.
And I think maybe the last part is you also just get optional privacy guarantees. If you want to trade on a DEX, you can compose the private trades as well as the transparent trades; it's essentially one liquidity pool together.
So I think there are very few applications which don't massively benefit from Anoma. Like, maybe something like a multisig, right? Something like that might see some improvement, but not as much as a DEX does. For the question of which current L1 or L2 would benefit the most from integrating with Anoma, I would honestly say pretty much every single one of them. It very much depends, because it depends what flavor, what kind of specific security model you're into, to answer this question.
Yeah, I think I agree, though. There are so many sticky users on all of these L1s and L2s. If you build on Anoma, you just have access to all of them. To me, it just makes the most sense. It feels like everyone benefits from Anoma, no matter what chain you're on. So, yeah.
Let's go for a little bit more of a developer question here. So if anyone wants to build on Anoma, is learning Juvix, which is the language that Anoma apps are built in, necessary? Or does Anoma support other languages like Rust or Solidity? I feel like, Adrian, you might be best placed to answer that one.
Because in the end, Anoma is essentially a number of large circuits. Anoma currently supports the RISC Zero backend, and then also one transparent backend. So over time, you can use whatever language you want, for example Rust, or your favorite language here. However, there are going to be some benefits to writing applications in Juvix. Most specifically, it allows you to move between proof systems. For example, currently you can build an application in Rust, but in that case you can only target the RISC Zero backend. Whereas with Juvix, because it's an integrated compiler stack, you can write your program, your application, once, and then pick the specific backend deployments that you'd like to shell out to, because Juvix just supports all these different types of backends. So I would really recommend writing applications in Juvix. And I say this knowing full well that learning a new thing requires time. But one of the huge advantages that Anoma and the Anoma Resource Machine have is that they've rethought the state-layer architecture to make all this distributed operating system stuff work. Anoma is not the place where you just go and copy-paste the same 50 lines of Solidity code and tell yourself that you're a developer. For that, there are like 500 different rollups you can pick from, essentially. If you actually want to build novel things, Anoma is the place to be.
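A conceptual sketch of the write-once, pick-your-backend point; this is not Juvix's real compiler interface, just an illustration of the idea in Python.

```python
BACKENDS = {"risc0", "transparent"}   # illustrative backend set

def compile_app(source: str, backend: str) -> str:
    if backend not in BACKENDS:
        raise ValueError(f"unknown backend {backend!r}; pick from {BACKENDS}")
    # A real pipeline would lower the same source to the chosen backend's
    # circuit/VM format; here we just tag the artifact.
    return f"{backend}-artifact-{hash(source) & 0xffff:#x}"

app = "transfer := ..."               # write the application once...
print(compile_app(app, "risc0"))      # ...then pick backends freely
print(compile_app(app, "transparent"))
```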
Yep, yep, I agree. Very true. Sounds good.
Okay, let's do a question from AID: what consensus mechanism does Anoma use, and how does it ensure security in a distributed compute model?
Yes, this is actually a very good question. If you're familiar with consensus mechanisms, great. Anoma uses a consensus mechanism called Typhon. Typhon is a combination of Heterogeneous Paxos plus Heterogeneous Narwhal.
So it essentially supports unbounded TPS; you can keep adding servers to it. It also already has execution sharding in there. And essentially, it's very adaptable. It supports small local deployments with one node: maybe you and your three friends want to play a video game in your basement, and you can run a local consensus instance just between those four computers. It also supports massive-scale decentralized deployments. So in that sense, it's very adaptable. And the nice thing is, with this controller system, you just get a high degree of interoperability between all these different instances. In the end, when you think about which family it lies in, it lies in the fast-finality BFT algorithms, so essentially like Tendermint, which has single-slot finality and so on.
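For context on the fast-finality BFT family mentioned, here's the textbook homogeneous quorum math; Typhon's heterogeneous quorums generalize this, so treat it as background, not a description of Typhon itself.

```python
def quorum_size(n: int) -> int:
    # Tolerate f Byzantine faults among n = 3f + 1 validators;
    # a quorum is 2f + 1, so any two quorums intersect in an honest node.
    f = (n - 1) // 3
    return 2 * f + 1

assert quorum_size(4) == 3
assert quorum_size(100) == 67
```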
Oh, yeah, sorry, I couldn't hit the mute button. That was a skill issue from me there. Have we talked about local consensus there? Or have we touched on that?
I think I'll do maybe two more questions and then we'll probably wrap it up. Is there any forecast for when the devnet will evolve into testnet or mainnet?
There's some idiom in English, I think... keep your eyes peeled. Thank you.
I was so confused for a second there; I didn't know what was going on, I'll be honest.
I've been told that I tend to make up my own idioms as I go along. So yeah, keep your eyes peeled over the coming month. Whether we'll upgrade the devnet into a public testnet, I don't know; we'll see, especially around the timing. I also think, with mainnet, it's not that far out.
Yeah, I mean, roughly, we want to deliver a mainnet this year; I think that's our goal, and I think that's what we've said before. We want to have users able to play with Anoma, feel Anoma, experience Anoma over the coming months, hopefully. And that will come in different flavors. And yeah, we'll keep everybody updated.
I don't know if we want to end with maybe, I'll do one last general question, just a very general overview. I guess this kind of ties into the testnet bit anyway, but are there any general Anoma progress updates? This is from Sam, by the way. Any general Anoma progress updates that we should know about?
Yeah, maybe I can take it. So, lots of good things happening at Anoma. I highlighted some of them, but just building off of that: very important for Anoma, we now have 25-plus builders like Repo, high-quality teams investigating unique use cases with the Anoma architecture. So that's great. As we start going through what we just talked about, the testnet and devnet, hopefully you guys will all get the ability to start playing with these. What else can we tell you? We already touched on mainnet. People always want to talk about the token and go-to-market and these kinds of things, and we'll share more over the coming months alongside the testnet. So those are going to be some exciting updates that I'm looking forward to, as we start solidifying for the community what this looks like from a go-to-market perspective and how people can participate in the network. And other than that, we have some events scheduled this year. We'd love to see people there. We'll have people at Berlin Blockchain Week, EthCC, and DevConnect, and we'll be hosting events there for people who are in town.
Amazing. Thank you very much for the update. So yeah, I think we'll probably wrap it up now. Thank you very much to the Repo team, RG and Harsh, for speaking on today's Spaces; it was very interesting to learn about your project. Thank you, Mike and Adrian, for answering those questions in very good detail. It's good to get some answers out to the community. And, sorry, just as a last thing: next Friday we'll have Anoma Day, and that's going to be streamed, so check out these socials to learn the latest on that. And yeah, thank you very much to everyone tuning in. It's been very good. Thank you. Thanks.