Future of Decentralised Compute

Recorded: June 9, 2025 Duration: 0:43:39
Space Recording

Short Summary

The Future of Decentralized Compute Twitter Space featured several innovative projects — Fluence, Aethir, Edge Network, and IONet — all aimed at enhancing decentralized computing capabilities. The discussion highlighted significant growth opportunities driven by rising demand for AI and decentralized solutions, and emphasized the need for improved user experiences and strategic partnerships.

Full Transcription

Thank you. Hello, hello.
Hello, hello. How are you? Very well thanks.
Very very well.
Please share the list.
Hello. Hello. Welcome, everyone. We'll start in a couple of minutes — just waiting for people to come on. Hey there, can you guys hear me all right?
Hey Tom, yeah, we hear you loud and clear.
Okay, great, fabulous.
I'll bet you're here as well, right? Okay.
Okay. Okay.
Okay, sorry, I lost you again — say that again?
I'm just saying, Kate, you just need to apply to be a speaker. Oh, great. And then we can kick off.
Here we go. Okay, I think we're good to go.
Kate, can you hear us?
Well, welcome everyone. We'll get kicked off — yeah, we'll start now, and we'll just wait for more people to join as we go. But yeah, welcome to the Future of Decentralized Compute Twitter Space by WeAreDecision. This is the third one over the last few months, and I'm very, very excited to do this one.
It's brought to you by a minimum and for a long time now.
You know, we've introduced Palm, we've introduced Aethir.
So yeah, it's been a very, very good podcast.
Before I kick off, it'll be good to go around and just basically introduce yourselves.
Tom, I'll start off with you.
Great. Listen, thank you. Hope everyone's going to hear me all right. Great to be here.
Tom Trowbridge, co-founder of Fluence. I guess just, you know, Fluence, for those that aren't aware, we are a decentralized compute platform.
So I think that's probably somewhat similar to most people on here.
We're focused on CPUs at the moment.
We can talk about that later.
Before Fluence, I helped found the Layer 1 Hedera Hashgraph (HBAR), which is — it changes all the time, but around a top-20 cryptocurrency.
So I've been kind of full time in blockchain and crypto since about 2017.
So I've been doing this for some time.
Thrilled to be here.
Very, very cool.
Arpit, I'll go to you next.
My name is Arpit.
I'm head of ecosystem at Aethir.
So Aethir is an enterprise-focused decentralized GPU marketplace.
Previously, I also come from an L1/L2 background.
I headed Enterprise BD at Polygon and then headed this region —
South and Southeast Asia and the Middle East — as a Managing Director for the NEAR Foundation.
I have been full-time in crypto for about four years now.
Very cool. Joseph, I'll go to you next.
I'm Joseph Denne, co-founder of Edge. We founded back in 2013 and started decentralizing in 2017, which makes us OGs, I guess, in the space. Edge Network is the cloud, but decentralized — decentralized in terms of compute, organization, and economics.
We basically put cloud services into the unused capacity that exists in the devices all around us.
And then last but not least Kate.
What a great team we have here today. Great to talk to everyone, and I'm so excited for this discussion.
My name is Kate, and I lead partnerships at IONet.
So IONet is a DePIN providing GPU clusters and really democratizing that access for any developer anywhere in the world, wherever they need to deploy a GPU cluster.
Yeah, really excited to dive into the discussion here today.
A question I want to ask all of you guys, before the more individual questions, is, I guess, talking about decentralized compute specifically and more generally: why is it so crucial? And what are the bottlenecks right now with existing compute that a decentralized sort of model can remedy? And I guess, I don't know, any one of you guys can go first.
In terms of why it's so important, number one for us, I think, is this concept of sovereign
computing. Authoritarian
governments love centralized tech, and we live in a world where really a handful of companies
control 80 odd percent of all of the traffic on the Internet. We need an alternative. So
in the same way that Bitcoin is freedom money, think about it like that, open permissionless
programmable, we need decentralized
compute protocols to underpin that and ensure freedom of expression, and there's a bunch of other things that go along with it. The other biggie, from our perspective, is the environment, particularly as we see kind of exponential growth in AI GPU requirements. There's huge growth in infrastructure requirements, particularly in data center construction and use.
But the spare capacity that exists all around us, so in phones and laptops and set-top boxes
and the like, is five to six times the estate of all of the data centers put together, and
it's sat there unused.
So decentralized compute has the ability to make that accessible.
Yeah, so those are the reasons we would see it as being decentralization as being crucial to this space.
Yeah, listen, I hope we disagree on some things at some point in the call, because it makes it more interesting. But on that point, I think there's sort of violent agreement. And I would phrase it — I would take a slightly different look, but sort of emphasize it, phrase it a little differently — which is to say, we have decentralized payments, whether you want to call it Bitcoin, Ethereum, Tether, whatever — not Tether, but Bitcoin, Ethereum, whatever it is, right — and it works.
And we have decentralized storage in terms of, you know, Filecoin, Arweave, a whole bunch of others. But we don't have decentralized compute.
And if you don't have decentralized compute, we're right back to the same problem of relying on centralized systems with all of the challenges and vulnerabilities that those present.
And so the hidden secret of all crypto is a lot of blockchains run even in the centralized cloud, right?
So compute is the bottleneck and compute is the
piece, the vulnerability of this whole decentralized world. To put an even sharper point on it,
without decentralized compute, we really are not that, we're really not decentralized. If we're
not decentralized, we are effectively tools of the cloud, which makes us effectively tools of
government. Without decentralized compute, you could argue we're really not going to have
a decentralized world, which really means we're all going to be subject to kind of the
strongest, most powerful government out there.
Digital serfs, I think I've heard it called.
Absolutely agree with all of that.
So I think
I have not heard a better
version of what Tom just said.
So I think this is the fundamental piece, right?
So without decentralized compute, there is
essentially no crypto.
So adding to one point that
Joseph made, so from our perspective,
especially on the enterprise compute
side, so we have also seen that there is a
huge unused capacity. So if you look at all the AI development around us, it is unlocking a lot of demand, right? People are going to all extents to get GPUs, hoarding H100s and H200s and all the advanced chips. And at the same time, there are data centers where the utilization is less than 20%. So you see, these two problems are part of the same thing. On one hand, we have a huge underserved demand, and on the other hand, we have underutilized capacity.
So that's where a platform like Aethir, which brings this operational efficiency, is really critical.
Anything to add, Kate? I think you guys really did nail that one.
That's why we get the big bucks.
Yeah, and I think something that we haven't touched on yet is, when it comes to sort of interacting with these hyperscalers, their minimums just to talk with a provider are so enormous and cost-prohibitive to most builders as well. That's a reason why this industry needs to exist today: to empower any developer anywhere to create and build on GPUs.
Yeah, Kate, you mentioned bottlenecks at the head. You know,
Tom said the vast majority of crypto projects are on centralized cloud platforms. That's like 75%,
right? I think
initiatives like the DePIN Pledge are helping here, but there's a long way to go with education and
just helping ourselves. And then, you know, just on what Kate said, scale is a thing too, right? I
mean, the biggest players in this space, Google, AWS, Microsoft are so big and they hoover up so much of the capacity,
particularly if you think about AI and GPUs and the costs that go with that,
that there are very large barriers to entry.
So the fact that we're coming at it from a radically different perspective,
you can compare a lot of what we do to those giants,
as in the net result for the end user, the customer is essentially the same,
but it's a fundamentally different approach. And it kind of has to be because of the scale of those,
you know, those of those players. I also think that's a key reason why it's so important that
we come together, you know, under initiatives like the DePIN Pledge and many others, to work together and, you know, give us as strong a footing as possible. So Kate mentioned, I think, cost savings;
Arpit mentioned general efficiency.
I want to ask more specific around those points,
I guess the benefits that come with being decentralized.
And you mentioned AWS, Google, sort of existing guys.
I guess my question is,
what is stopping decentralized from growing Google, sort of existing guys. I guess, like, my question is, what is stopping decentralized from growing,
given those sort of costs and efficiency benefits?
I think it's important to define what we mean by decentralized, right?
So if you look at edge computing, generally,
not our project edge, but just edge computing,
it's growing really, really rapidly.
It's growing faster than the cloud at the moment,
but most of those deployments are coming, you know, from the big players. So you could say that from a computing perspective, decentralization is kind of happening and growing, and there's growing awareness around it. But, you know, if we take it as pure decentralization — so, you know, the freedom to interact and interoperate in a permissionless way — yeah, there are huge, huge barriers to that.
And I think personally, one of the biggest is the complexity that exists in user experience,
particularly around the payment side. But, you know, also stems from things like stability,
you know, bigger businesses having confidence that this newfangled technology is going to
stand up and stay online.
You mentioned crypto wallet complexity and the like.
So there's quite a bit there from a UX perspective that is getting radically better, but there's
a long way to go.
And there's education that goes alongside that, I think.
And I guess I'd add to that: UX is clearly an issue, but it's not, you know, it's not just UX.
It's actually, you know, a whole software stack that is basically attuned to various compute needs.
So compute, as much as I wish this weren't the case, is not just a commodity that you can swap in and swap out.
It's actually a service. And so I wish it were just a CPU where you could swap in someone from the cloud for your CPU. But the reality is the cloud offers a whole wealth of services. And now some aren't great, and some you're overcharged for, and a whole number aren't necessary. But the main cloud providers have still spent, you know, two decades and tens of billions of dollars optimizing the compute service offering to be appropriate for a whole list of specific verticals. And so that is not something you can easily develop a superior alternative to.
The way you do it is you can chip away at different aspects of that and different use cases, and build services that are competitive and cheaper and better in certain verticals, and grow from there. But compute, as much as it sounds like just one type of commodity service, is actually not, and is actually very complex to offer broadly the way the cloud does.
Yeah, that's absolutely right. We shouldn't beat ourselves up about it, though, right? Because AWS, obviously, were in a greenfield for probably the first decade of their existence, and it took a good decade to get to anything that you would consider scale. So we're pretty early in the game as well, right?
Oh, sure. And it's easier for us also, because there is a roadmap. They had to create the market, right? So they had to both build stuff and first figure out what people want, to then build it and keep tweaking. So at least there's a path now, so it should be easier for us. But we also don't have the resources they do. So while we have a path, it's still going to take us effort and a lot of ingenuity to compete with the resources they have spent in building some of these things.
But we've got open source there as well, which should give us a real leg up, because obviously, as we know, they rely a lot on open source as well. So, as you say, it's a bit of both.
Yeah, completely. I mean, AWS's genius was taking the complexity of on-premise and shifting it to this newfangled thing called the cloud, which made things radically easier. But as you mentioned, they've got so many services that it actually ended up being radically more expensive as well.
more expensive as well. And now we've got the opportunity to really essentially get back to on-premise but in a
completely open free form permissionless way which I think is pretty exciting I think we've
just got to find the right sort of lenses to help shift that perception and understanding in the
marketplace. Yeah, so I have a couple of points to add here. So I do agree — because AWS and GCP, they have a big... so rather than a service, I look at it as a big ecosystem of offerings that they have.
So for example, if an enterprise goes to AWS for all of their infra needs,
they just have to go to AWS and they get their storage,
they get compute and a whole lot of models and everything at one place, right?
So that's definitely a big factor which creates that lock-in, especially with the bigger clients. That's one. The second factor is the sheer scale of the sales organizations that these companies have today, right? So practically, between GCP, Azure, and AWS, the truth is that their sales teams are probably plugged into pretty much all the tech buying centers globally. So that's a big barrier overall from a sales perspective.
But there is a nuance to it when it comes to GPU that I have observed. So I have been part
of multiple sales conversations. And not one time have we lost due to cost — due to price, right? So we are like 50 to 80 percent cheaper than all of these bigger companies, the AWSs and GCPs of the world. And I have seen that, specifically with GPUs, people are willing to quickly switch their providers just because of the cost associated with that specific compute. So I have seen people make the switch, especially for higher-priced products like compute. Yes, it would help to build a stack of ecosystem or services on top of it. For example, not everybody is comfortable enough or advanced enough to handle bare-metal GPUs, right? So they need services on top of that, so that will definitely unlock the next level. But I definitely see hope. We are growing pretty fast — we just hit $140M ARR, all from service fees selling GPUs. And of course, you know, the space overall is very early in this whole thing, right? So this will grow eventually, yeah.
So just to reiterate, you're saying it's less of a product and maturity problem, and more of a distribution and sales problem.
Yeah, so Tom, that's — I feel that's a big part of this, because sales and distribution is really critical, right?
So if I am not able to reach the clients,
I can't really convey my value prop, right?
So if I have my sales plugged in
and my distribution channels developed,
then this will definitely unlock a lot more demand
than what we are seeing now.
They say AWS has 60,000 salespeople, something like that, worldwide.
So that's the scale.
You know, just to give you numbers,
Aether has like 150 people across all the different teams.
Yeah, there you go.
So, you know, we have the viable alternatives from a technical perspective today,
although there's obviously many challenges to crack.
But I think it boils down to a question of reach,
it's marketing, it's education.
Yeah, and I'd also think here it's worth articulating the distinction between CPU and GPU.
And so most of the business of the cloud
that I think Joseph and I were talking about is CPU related.
And that's what they've spent decades building.
Those are the services that they've invested in.
The GPU marketplace is much newer.
And so our focus of Fluence has been on the CPU side of things.
And there we're competing with much more mature service offerings, many of which are mission critical to businesses.
And so it is a significant proposition to shift your business's compute over, because you are effectively shifting your business over. And so you can do it piece by piece, but it's a little hard. And on the GPU side — obviously, businesses will not be able to run without AI quite shortly — but I don't think the service levels have been invested in by the clouds the way the CPU side has. So it's a little bit of a different market, even though obviously both are compute.
That's a fair point, yes, absolutely.
Yeah, I would say that one of our big focuses right now, just to everyone's point, is creating that brand presence as well. So the Google brand, the AWS brand — ubiquitous, like household names.
And so this is a huge part of our personal goal
is just really to build more awareness in the space,
especially with founders and enterprises alike,
about IONet and the decentralized AI space.
I think Arpit mentioned that once you do have these conversations with teams,
it's quite easy.
But just making those connections and just providing that alternative
is a huge goal for us this year.
Now that we've, I think, especially focused in 2023 and 2024 on building out the supply side and building out sort of the tooling surrounding IONet.
Now it's really a go to market motion that we're quite focused on.
Yeah, it seems you guys all agree in terms of the,
so this is a distribution problem.
So if we dig deeper into that, I guess like,
and I know Tom, you spoke about verticals specifically.
What are the verticals, what are the sort of types of companies
that are sort of the most warm to decentralized compute
or is it pretty much sort of everyone that you speak to?
Sorry, I missed that keyword.
What are the most, the companies that are the most, what?
The most receptive to a tool like this.
Ah, listen — so the way I'd frame what we're doing at Fluence is quite easy at this point,
which is we're focused initially on third-party node hosts.
So these are businesses that individuals like consumers and also some companies outsource
their node hosting to.
And so these are businesses that basically run blockchain nodes as a service, and so they are both price-sensitive and relatively sophisticated, and also have at least a marketing interest, if nothing else, in decentralization. And so for us that's the perfect market, because we can give them decentralization and save them between 20 and, you know, 80 percent of their current bill for running their nodes, which usually run right now on cloud CPUs. And that market, to give a sense of scale — it's hard to gauge exactly — is anywhere from, you know, $500 million to a billion. So it's a relatively small piece of the cloud market, like a very small piece, but still big in and of itself. And that is what our pipeline is focused
on right now. As we chew through that, we will add, you know, then we'll go to L1s directly, run nodes for them and keep
moving. So we're effectively focused on one segment that's quite receptive to us, where, were we to get any material market share, it would still make us a massive success. And then as we do that, we'll be able to add additional verticals as well. But for right now, that is the most receptive client base that we have found for what we offer.
And Arpit, Joseph, Kate — provide your side in terms of...
Yeah, so for us, in terms of market segment, I think startups and SMBs are probably a good fit. There are two reasons for that. First, they make decisions quickly — the buying decisions are quick with these teams when compared to enterprises. Plus, price sensitivity is pretty high, and the inertia to move from a cloud operator like AWS to Aethir is comparatively less than for an enterprise client. We do have enterprise clients also, and I have seen such teams — the moment they exhaust their GCP credits or AWS credits, they will immediately, you know, get in touch and move their AI loads to Aethir. So I think that's on the Web2 side of things.
On Web3, we are now very actively tracking this AI agent space, which is rapidly evolving — it was moving even faster a couple of months back and is now kind of consolidating. So that's a focus area for us.
We have seen good success in that space.
Since December, we have been running a grant focused on AI agents
where we have seen hundreds and hundreds of applications
and 30 teams that were provided with compute grants as well.
Plus, we are also working with other L1s and L2s to create joint grant tracks to funnel in innovative AI
teams from them.
So basically, on the Web2 side, startups and SMBs,
and on Web3, mostly AI agent projects.
Joseph, Kate?
And we have relatively broad reach, ranging from individual developers through to pretty
large enterprises.
And on the higher end, in terms of enterprises, we tend to work with businesses that are actively
looking for disruption.
So they're looking for something new.
They can see what's happening with AI.
Maybe they've got slower internal teams.
You know, they want shaking up a little bit.
So we do quite a bit of that.
Whilst, you know, obviously trying to be as dev-friendly
and also I should say now as bot-friendly as possible.
So, you know, everything's API first
and we're setting ourselves up to be able to be run by machines directly,
which I think is going to be a fairly
radical shift over the coming months and years.
Yeah, I think for IONet, for us, it's evolved a little bit. So originally, I think our two big tranches of users were, you know, anyone within the decentralized AI space, and then also within generative AI on the Web2 side — anything from, like, a startup all the way up to an SMB or a breakout team. But in many of the conversations that we had
with some of our prospects, we realized that some teams, especially newer ones, may not have the internal resources to manage their own GPU clusters.
And at the same time, we also saw that demand for inference has really skyrocketed in the last few quarters.
So that's when we also introduced IO Intelligence, which is our inference API and sort of a model garden, which provides access to all sorts of open-source models as well.
So we've tried to keep our ears to the ground to what the users need
and try to build out new tools and solutions especially to address anyone that's
building on the agentic side and just needs an easier way to plug into DeepSeek for instance.
Yeah, it's interesting, those last three answers — and Tom, your answer is very interesting too, not to write you off there. But Arpit, you mentioned how you guys have put together an AI agent grant. And then Joseph, you mentioned how you guys are gearing up to be API-first, for these sorts of AI agents to interact with you. And Kate, you mentioned the same sort of thing.
I wanted to ask, in terms of, I guess, growth sectors in the medium to long term — is this really key to the decentralized compute wave?
I believe it is. We're seeing LLMs getting into basically every business around us, which is radically shifting the development space.
And we're seeing a lot of demand for and kind of pushback on the more centralized kind of AI providers. So, you know, latest kind of GPT stuff has shone a bit of a light on that
in terms of a judge demanding that they maintain
records of every conversation. I think what Venice AI have done is fantastic. There's definitely
scope for decentralizing LLM use entirely. We have a working build of that in the network,
which we're pretty excited to be bringing to market pretty soon.
So yeah, I think it's kind of fundamental.
It's rewiring basically everything that we do.
I don't think enough people are talking about it.
I think we have a coming jobs crisis because of it.
It's going to upend all of society
in fantastic and exciting and terrifying ways.
Sorry, I'm not sure if I got the question correctly.
So in terms of volume, I feel that Web2 side will unlock more demand
in the next few quarters just because of the AI activity that we are seeing there
in traditional businesses where the way things were used to be
are changing very rapidly.
So I think volume of the demand will come from there.
But exciting use cases, I'm pretty sure,
will come from the AI agent space, which moves really fast.
But in terms of the number of GPUs that they need, that is still very, very low when compared to traditional businesses.
Okay, sounds interesting. I want to go around, because I'm conscious we haven't spoken about specific questions on each one of your businesses. So let's go around, I guess, and talk about that sort of medium to long term — where do you guys see yourselves playing? And Tom, I'll start with you.
You know, I guess this repeats a little bit of my comment on the traction we have right now, which is third-party node providers. And so that is not as sexy as some of the other areas that people are talking about, but I think it's a real business that can really scale and be terrific for us. The next vertical is obviously blockchain nodes themselves. And then we're also
working simultaneously on the AI agent side as well. And so I think what people realize is agents need a lot of compute
and they need a lot. And that's both, there's two different kinds, right? There's both the GPU side
of that for training, and then there's also CPUs for querying and running. And so I think that is,
we're going to see agents need both of those. And I also think we're going to see agents scale
dramatically and be embedded everywhere.
And so there will be customer service agents, there will be travel agents — I mean, they'll be absolutely everywhere. And so pretty soon, I think an AI agent will just be a standard offering that's embedded into every business's kind of operations, if you will. But until that is the case, I think agents will certainly be a growth area for us, as I'm sure it is for others on this call as well.
Okay, Kate, I'll go to you next.
Yeah, so I think that, you know, we'll just continue to focus on our mission to make compute available to any developer anywhere around the world, no matter the size of their project or organization.
And kind of also expanding on what we've been talking about today — especially as we see every single business adopting agents and LLMs, I think we're also going to see more of a need for the personalization of models as well: say, fine-tuning a model with your own data, or building in RAG so that you can root your LLM's answers in truth. So we're trying to think about and anticipate what our developers are going to
need in the coming quarters in order to build truly seamless agentic experiences for their customers.
Yeah. So for us,
I'll divide the priorities into three parts.
So on the GPU supply side,
we are focusing on onboarding more and more data centers to the network.
So that basically entails working with different data center segments.
So these can be traditional tier 3, tier 4 data centers,
some smaller data centers which have some crypto understanding,
and then miners.
So working with them, providing more education
about what this whole DePIN thing is all about and what kind of rewards they can expect. So that's a big focus for us. Secondly, on the demand side, my personal focus is on developing more and more distribution channels, especially on the crypto side. So we will be working more closely with Layer 1s and other bigger ecosystems where there is a lot of AI activity, so that we have robust Web3 AI agent demand.
And then of course, you know, the Web2 enterprise team is focusing on the traditional side of things.
And then third, which is a really key priority for us, is to solve for the financial layer of this whole DePIN space that we are in.
So for that, we have, for example, recently partnered with EigenLayer, where we have opened up a liquidity pool for retail investors to participate in. So now retail investors can participate in that liquidity pool, and those ATH tokens can be used by data centers who are onboarding their compute onto the network. So basically, this lowers the barrier to entry to the network, and at the same time also brings more and more retail investors into this space. So that's another key focus area for us.
We have a core focus on user experience and improving our own interfaces at the moment,
which is all about lowering the barrier to entry.
And alongside that, our other big focus is AI operations, so decentralising LLMs alongside
a robotics offering, specifically supporting how discrete robotics devices can work together in
the real world, so the understanding of their environment. And finally, we have a dev track
for physical devices, so we're going to be bringing a series of dedicated nodes to market
designed to give assurance to compute in our network and in other networks,
while also kind of underpinning that sovereign compute
that I was talking about at the head of the space,
by lowering the barriers to entry and opening the doors
to those devices for essentially any other crypto project
to run their own nodes through a simple marketplace.
So I guess the common track there is lowering the barriers to entry,
making it as easy as possible to, you know,
on board yourself and your business to decentralization.
Very cool, guys.
We're getting to the end of the space.
We put it at 45 minutes.
It's flown by.
Last thing I want to ask you guys before we wrap up is,
what is one thing you want the audience to sort of take away
when it comes to looking at decentralized computing?
The biggest sort of learning for me, I guess, is the prominence of AI agents going into the, you know, medium to long term, which is amazing to hear.
But what do you guys think?
So one thing that I'm taking away is that right now it's fine, but eventually we need
to grow a stronger ecosystem of services and products which can support a client in all of their needs.
So it can be a general purpose computer, GPUs and storage.
So all of this has to come together and the user experience has to be really easy for clients to really onboard.
So we can't have the broken user experience
that we have seen in crypto for so many years now.
So user experience is definitely going to be the key
for onboarding this new set of users to deepen.
I really like that answer,
and I kind of want to sort of piggyback off of it.
I think we should all — and I'm sure it's something we're all doing — really focus on making sure that in our daily lives we're dogfooding our products, and really focus on testing out and fully utilizing all of the decentralized AI tooling out there. So yeah, I've cut the cord on all of my ChatGPT subscriptions, et cetera, in order to really just focus on using IO in combination with, yeah, other, you know, vibe-coding platforms as well. So
if you're sort of passionate about this space, make sure that you're, you know, using all of the tools in your tool belt and really digging deep and learning, because I think this is going to take off like wildfire in the next months.
Hard to pick one, but I guess I would say two things. One, decentralization isn't successful without compute, as I mentioned earlier. And so I think it's just a question of time before this is an enormous market, and, you know, these projects on the call, even at a tiny, tiny success level, will be huge. And so I think that is kind of one
thing. It may take time, but I view it as sort of nearly inevitable. And so that, I guess, is the
most important thing is like, this is huge. It's happening. There are reasons for it to happen,
right? There's reasons why it's cheaper. There are reasons why customers want it. So it is not a kind of solution in search of a problem, but it actually is solving
a whole number of very critical problems. So I guess if I isolate it down, I'll just say that
this is enormous. It is growing quickly. And — you know, let me back up one second. If you want to be in one sector, what do you want? You want a big market that's growing quickly,
that has unhappy customers. And that's what compute is, right? And so to me, this is the
sector that you want to focus on, whether you're an investor, whether you're an entrepreneur,
whether you're a builder, whatever you are, this is the place to be.
100% agree with that. The technology exists for a different world today, right?
We just have to opt into it collectively.
So join our networks, contribute your capacity and you'll reap the rewards.
Yeah, absolutely fantastic answers, guys.
Really loved it. Guys, we're going to wrap up the Twitter space now,
but it was such a pleasure having you all on.
And thank you to the audience for listening.
Thank you so much.
Follow the guys on the call today.
They were absolutely spectacular.
And, yeah, turn the notifications on for us on Twitter.
We're going to do more of these in the future.
And, yeah, thank you so much again.
And have a lovely day.
Great. Thanks so much for hosting.
Appreciate it.
Yeah. Thanks for having us.
Take care.