Waiting on everyone to join. I know you'll have some questions after.
Good to see you again. Hey, guys, giving everyone a few minutes to join.
We're a bit early, so opening up the space so everyone can get in. Good to see you, my man. Hello, hello. Good to see you.
But excited for this convo.
I've been thinking about this all week.
And diving into what Flock is doing, as well as 375.
But you guys on the sort of federated learning side, I mean, it's one of my favorite projects. So excited to dive in. Great. Yeah. Yeah. Great to hear that. And thanks for organizing.
Yeah. Definitely looking forward to having a great conversation today. Yeah. Awesome. Are you in New York or where you're based? I'm just in upstate New York. Okay, fine. Yeah. How are you?
Awesome. Awesome. I'm in New York as well. Does Flock have a big American presence?
We do not. I am the only U.S.-based team member.
We have a team of about 30 full-time contributors, and so we're spread across.
We've got a big presence in London and Hong Kong as well, but we're distributed globally.
Carrying the flag for AI in the U.S.
Good to see you. You should have an invite.
Waiting on Harry to join.
Look, I'm excited for this one.
This is a really unique kind of intersection of both.
I know 375 pushes things to the edge.
Flock does a little bit of edge.
We'll dive into what federated learning is and all of that.
But this is, to me, this is sort of a match made in heaven that I wouldn't have seen, obviously, but glad to get both teams on.
Oh, same here. I'm, like, really hyped.
I had, like, a really fun time coming up with a lot of the questions, too.
We're going to wait a few minutes,
start probably five, ten past the hour.
Just letting everyone join in.
You all, give me a like if you're tuning in outside of the U.S.
I'm curious where the audience is coming in from.
Okay, we may have a bunch.
I would love to hear a question from you guys towards the end.
That's a really interesting DePIN protocol.
Look, we have a pretty global audience. Anyone at ECC? Give me another interaction if you're here.
Look at you guys. Y'all are hustling.
Awesome. Awesome. Good to see
Awesome. Good to see everybody.
Waiting on Harry here. We'll get started here in just a second.
I hope everyone brought their pen and paper to do some deep diving.
I mean, we have two fantastic teams on who are doing substantive work in both DeAI and DePIN.
So, you know, that's what we're all about here at Vested.
And so, you know, I'm excited to have a convo that goes deep into work that can impact the
real world and solve real problems within Web2 AI and for traditional infrastructure,
like 375 does. Thank you. I see the comments from everybody.
We're going to give Harry a few minutes.
Good to see you, Unleashing DePIN.
Look, I'm going to let you come up really quick and say hi to everybody. Hey, my man.
what is going on everybody
Good, good to see you. Feel free to give a quick plug on Unleashing DePIN. Love folks doing substantive work in DePIN. So feel free to say hi.
Yeah, we're the Unleashing DePIN podcast. So we have been chatting with founders
in the space for almost three years, which is a little bit wild to think about. We've also
published a lot of different research around the kind of the financial implications of how to do
tokenomics models and valuation reports within the space. So we're all in on DePIN.
To spill a little alpha: we see Harry up here, and we just did an awesome recording, which will be out very soon.
So, yeah, that's kind of who we are and what we do. Thanks for having us up, Joshua.
Love it. Love seeing you guys. Okay, Harry, good to see you.
Good morning. Good morning. How are you?
Good. Are you calling there from the UK as well?
Oh, no. Don't let the accent fool you.
No, I'm in North America.
I've been meeting more and more people from Canada or living in Canada, which is crazy.
I love it. So, hell yeah.
Love it. Good to hear everyone.
Okay, let's go ahead and get started, guys.
So, quickly, I'll introduce myself.
I'm Joshua Howard. I'm an investor at Red Beard Ventures full-time.
And Vested is a passion project and a growing community focused on DeAI, DePIN, and DeSci.
We started it maybe two or three months ago,
and it's grown to over 200 top investors, builders, and researchers in those three categories.
We have a huge network of top-tier C-suites, high-net-worth individuals, investors, et cetera. And we're hosting
conversations, publishing think pieces, all about those three categories. We fundamentally believe
those three categories are the most exciting in crypto for three reasons. They're expanding Web3
TAM, they're solving the demand side problem, and they're bringing Web2 builders organically
into Web3. So I founded it, and I have a great team helping: Jeremy, who will be hosting today and who's also an investor,
and Reese, who is here today as
well, who has done some great work
both on the research side and helping
coordinate this event. So, Jeremy, look,
I'll pass it off to you. He's gone deep on
each of these, so this conversation will be great.
And pleasure to meet everyone here.
Also, pleasure to officially meet you, Harry, as well as Matt.
So I guess let's kick it off with a quick introduction.
Please give a quick introduction of who you are, what you're doing at 375, and what you guys are building.
And same goes to you, Matt.
Nice to meet you too, Jeremy.
My name is Harry Dewhurst.
I'm the co-founder and CEO at 375AI.
I've been an entrepreneur my whole career. I started my first company 15 years ago, quite
quickly built and sold that, moved to Silicon Valley, and started another business there, which
I went on to sell to Singtel, the large mobile operator in Southeast Asia. That was a data business focused on
understanding what people do on their phones, and this was kind of all at the rise of the
mobile internet and apps and so on. I then was president of a location data business, which has a lot of synergies with the physical AI space.
I guess a very primitive version of it, whereby we used signals from mobile devices to ascertain people's
movement, and we were able to use that to help retailers, shopper marketing, and so on do a
better job at reaching their target audiences. Lastly, before founding 375AI, I was the CEO
of Linksys, the Wi-Fi router company, which some of you may have heard of or had those devices in your home.
That company was founded in the year I was born.
It was one of the kind of pioneers of Wi-Fi originally,
and I was brought in to kind of rejuvenate that business. But it was there that my now co-founder, Rob, and I crossed paths and discovered projects like Helium, and what is now, I guess, called DePIN, though it certainly didn't have a term or a name back then.
And we left to start 375 three years ago, almost to the day.
And 375 is a data platform.
We're collecting huge amounts of real-world information,
vehicular and people-movement data from an array of sensors, cameras,
and devices that we've deployed across the U.S., to digitize the analog: to turn
what would otherwise be, you know, analog real-world information into digital, digestible information.
Hope that wasn't too much of a mouthful. No, that was great. And also, happy anniversary!
Are you going to celebrate with the team or anything? They're all coming
up to the house that I've got in Canada. I've got a lake house up here, so my
team are coming up here on Sunday, and I'm looking forward to that very much. We haven't
all been in the same place at the same time, ever, or at least the key guys.
We've all kind of met each other individually, but it's never been all at the same time. So I'm very
excited about that as well. As they're coming in, are you going to put up an Edge just to clock the cars and be like, ooh, what's going on over here?
I'm in the countryside, so there aren't
many cars driving down, and the
driveway is about five kilometers long.
375 Edge. There won't be much data to capture around these parts.
Well, at least a couple of squirrels.
So now we're going into the DeAI space with that.
So, Matt, you're next, bro.
I am the VP of growth at Flock.io.
Flock is a decentralized AI training platform.
We went to mainnet and had our TGE at the very beginning of this year.
The company has been around for a while; it was building in stealth since about 2022.
Engineers submit trained models to the platform; we call them training nodes.
The submissions from the training nodes are
validated by a group of distributed nodes,
a network of nodes called our validators.
So they will look at the model itself that was
uploaded to the platform and come to
consensus on the loss score for it.
At the end of the training process,
those models are made available,
fully open source, on Hugging Face.
And then we offer additional fine tuning
and full stack AI application development on top of that.
So hopping in here: we broke this down into a couple of segments.
We're going to go into technical, regulatory, privacy, the whole shebang.
So yeah, this is going to be amazing.
So the first thing we're going to start off with,
since both of you guys are obviously in the AI space,
what attracted you most to it as well as,
actually what excites you most?
And also, since we do have a centralization issue,
I would love to go in and pick your brains on
how can we eliminate certain biases?
For instance, Matt, you just said that you guys train AI models on contributed data.
I'd love to know more about how you guys are solving that
issue at Flock. And yeah, let's just start from there.
Yeah, sure, so I can start. Personally, I've been working in blockchain and crypto for the past almost five years. I started out doing freelance
blog writing covering the mining space, mostly the Bitcoin mining space.
And then I took a full-time job with an institutional infrastructure provider, where I helped to stand up our institutional staking business.
Right around the time ChatGPT came out, sort of the ChatGPT moment,
I was starting to get interested in AI more seriously,
and decentralized AI specifically, given my background in blockchain.
And so what really excited me about decentralized AI was a few things.
One, we are able to incentivize a distributed group of members across the globe
to participate in AI training and reward them for it. And also, there's user ownership and
sovereign data, having control of your own data, which I think is incredibly important.
That's what really attracted me to it.
These are big, hairy problems that we're working on,
but it's exciting because it's a future wherein users
are co-creating the technology
that we're using,
we're being rewarded for it,
and we're controlling the future of it.
We're not necessarily just
beholden to all of the centralized AI providers.
That's what really excited me about it:
just the idea that it's extremely cutting edge
and definitely rewards contributors for their work.
Nice. And Harry, how about you?
I mean, in the space that we're in, you know, we didn't necessarily start out considering
ourselves an AI business. But as time has progressed, it's certainly become more and more
central to what we're doing. We're certainly using a lot of it, and none of what we're doing would be possible without it.
There are businesses in the centralized space that are capturing similar kinds of data,
and those businesses are really kind of flourishing.
So we'd certainly like to see how we can disrupt that in a decentralized manner and outpace their growth, you know,
with the community. So, you know, that's really what attracted us to the DePIN, DeAI,
whatever-you-want-to-call-it thing.
But, you know, it's the ability to grow and scale a network at an unprecedented pace and give, I guess, the legacy incumbents a run for their money.
Now that you said that, because one of the follow-up questions that I have is:
right now you have these enterprise and legacy solutions.
They're using AWS, Google Cloud.
And since you're going, in a way, against the grain and saying,
"hey, we're just going to embed models on our devices,"
what really made you decide to stick with and double down on that initiative?
So, I mean, to be honest, we still use centralized compute resources where necessary. But that's
not our core business. Our core business is decentralized sensing and the gathering of
information, and that's really what we're focused on. We're not a decentralized compute
platform; we're not a GPU cluster. Now, in the future, I imagine we will, and we are beginning to, explore partnerships with decentralized storage and decentralized compute,
so that anything that we do off of our own devices is done in those environments, to support, you know, the whole decentralized community.
However, the vast majority of what we do is done at the edge, on our device. The reason why we
chose to build an architecture such as that was not because it was easy; it was actually far, far harder. It effectively means that every single device and deployment that
we host has a mini data center sitting behind it on-premises, with an NVIDIA GPU running its own model,
doing all the computer vision processing at the edge. But the reason for doing that is, one, there are some efficiencies to be
had with it: you know, streaming six 50-megapixel cameras,
24 hours a day, up into a cloud has costs associated with it. But two, and more importantly, is privacy
and the evolving landscape of privacy, so that we can be best in class in adhering to that,
plus choosing, morally, what we want to collect and what we don't. And therefore, you know,
once data has been extracted from the analog world, as I described it, that information is discarded.
We no longer need it. We've extracted that there were this many semi trucks that went past, this many Honda Civics, this many motorbikes, and this many people walking by with a Starbucks coffee cup in their hand.
And once we've captured that into digital data, we no longer need the video, and the video never leaves the premises.
So, I'm sorry, Josh, you can go.
Yeah, just want to, Matt, I don't know if you had a thought there, but I wanted to ask a question I think that will hit that previous point.
And I want to dive into the sort of real-world use cases for 375 and Flock.
So Harry, to that point of having data captured at the edge from a DePIN device, et cetera, walk us through: one, why is that useful?
Who is it useful for? Like who's
using it? And on the Flock side, you all saw some core problems within the AI supply chain,
so to speak, which is commercializing your model or fine tuning your model, et cetera. So
walk us through what are the core problems in AI you guys are seeing on the Flock
side that made you start Flock? So I guess I'll start by answering my part of that question. So
yes, we're capturing information about what's going on in the real world, kind of as it happens. The use of that data is multifaceted.
Today, our customers include businesses in the transport and logistics space, which is kind of
an obvious one given that we collect so much transportation data, and also in financial services: hedge funds, trading firms
who are using that information to plug into their own models
to understand business performance as it relates to commercial vehicle movement.
So are there more FedEx trucks on this route versus last month
or FedEx versus UPS versus Amazon.
All that kind of data is very leading indicators to business and economic performance.
In the medium term, we anticipate selling this information to the LLMs
to give them eyes and ears on what's going
on in the real world. They know everything about what's going on online:
what you click, what you search for, what you buy. But 75% of commerce happens
offline, not online, and yet these LLMs are totally blind to that.
So we will make our data available to these LLMs, be it the LLMs themselves or customers using custom parameters,
in order to give them much more tangible and useful insight and information.
And here, let me ask quickly, I want to let Matt go.
So is it a physical device for our audience?
Is there AI happening at the edge of the device?
You know, walk us through that part.
So, yes, it's a physical device.
The first product we launched is called the 375 Edge.
It's a substantial device.
It's an expensive device.
We sell them for $50,000.
They have six very specialized high-definition cameras.
They have terabytes of storage, connectivity, as well as a whole host of other sensors embedded within it.
The device itself weighs 75 pounds.
They are mounted on billboards across the US and on some of the busiest highways in the country.
We've sold all of the devices that we've manufactured thus far.
We intend to sell a further batch of devices later this year, but also launch a smaller device
called 375 Street, which is another camera-based sensor with NVIDIA AI at the edge,
at around the $1,500 price point, which anyone could deploy really anywhere.
Oh, did we lose you, Harry? I think Harry cut out.
Someone tried to call me; I'm in high demand.
So, yeah, the more unique cars, people,
or events these devices are placed near and able to perceive,
the more net new data is being added to the network.
So that will also be launched and sold a little bit later on this year.
And then quickly, Matt, so for you on the Flock side:
what are the problems in traditional AI that you all saw where you said, okay, we need distributed fine-tuning,
distributed training, et cetera?
So there are a few problems that, you know, we saw in our scene. Certainly, there's lots of data that's being stolen:
models are being trained on user data, and users didn't necessarily consent.
This is often talked about.
We're squarely focused on privacy preserving development.
And so our founders have a very rich background in federated learning,
which is, for those who aren't familiar, it's basically a way to train AI models
where there are multiple data owners, often called clients, in a way that is distributed, such that the underlying data, the specific data points, is never exposed.
Only the model weights are shared and distributed.
But that's very hard to do in traditional federated learning in a decentralized
and trusted fashion because there's often a centralized server to which all the model
weights are being sent. And so what we are, I guess the problem that we're trying to solve is
how do we create a decentralized peer-to-peer network that parties can trustlessly interact
with one another while also trusting that their
underlying data is being preserved in order to create a decentralized model that is still
high fidelity. It can be used in various contexts, but that the actual underlying data is never
exposed. So it stays local, it stays secure, and obviously stays private.
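The flow Matt describes, clients training locally and sharing only weights while the raw data stays on-device, can be sketched as a toy FedAvg-style loop. This is purely illustrative: the one-parameter model and all function names are made up for this sketch, not Flock's actual API.

```python
import random

def local_train(weights, local_data, lr=0.1):
    """Each client updates the shared weights on its own data.
    The raw (x, y) samples never leave this function, i.e. the device."""
    w = list(weights)
    for x, y in local_data:
        # toy gradient step for a one-parameter linear model y ~ w * x
        pred = w[0] * x
        w[0] -= lr * (pred - y) * x
    return w

def federated_average(client_weights):
    """Only weights are aggregated; no raw data ever reaches this step."""
    n = len(client_weights)
    dim = len(client_weights[0])
    return [sum(cw[i] for cw in client_weights) / n for i in range(dim)]

# Five clients, each holding private (x, y) samples drawn from y = 2x.
random.seed(0)
clients = [[(x, 2 * x) for x in (random.uniform(0.1, 1) for _ in range(50))]
           for _ in range(5)]

global_weights = [0.0]
for _ in range(20):
    # Each round: clients train locally, then share only their updated weights.
    updates = [local_train(global_weights, data) for data in clients]
    global_weights = federated_average(updates)

print(round(global_weights[0], 3))  # converges toward 2.0
```

The privacy property being described is visible in the structure: the aggregation step only ever sees weight vectors, never the samples each client trained on.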
So I have a spicy question for actually both of you guys.
Harry, since you're allowing users to be able to capture traffic data,
how does that work in terms of data privacy?
Is there a process that you guys have that will
secure people's data? And what type of information are you passing on?
And the question I have for you, Matt: since I know you guys are doing secure
data processing, how, how can I put this, how do you really trust the underlying data?
Because I know there were a couple of cases, and even some research
papers I was reading that go into, you know,
how can we trust the data to make sure that it's not biased?
Are there protocols in place that allow you guys to sift
through that, to make sure it's truly unbiased as it's being processed,
or does it just come as you go?
Matt, why don't you go first? I'll go second. Okay, yeah. So I heard two
questions in that. One is, how do we ensure the quality of the data,
the integrity of the data, and that it's not malicious or stolen data.
And then the second question being more around the bias
that is potentially then surfaced in the model.
So on the data side: the Flock system,
the Flock protocol, never actually sees any of the data.
So in theory, yes, there could be parties
that are participating, clients that are participating
as these federated learning nodes
who are contributing data that is stolen.
Because of the inherent nature
of the privacy-preserving protocol,
we're not able to check whether that data is stolen.
That being said, if there is malicious data or there's this thing called a data poisoning attack,
which can corrupt models,
the way that we fix for that is it's a distributed...
So we have a distributed network of these nodes.
When we propose a training task to the network,
the nodes who are participating,
we have basically two different types.
We have what are called voters and then proposers.
So the proposers are the ones that are,
and those are randomly selected.
And so the proposers are the ones
that are proposing new model weights
to be sent to the global model.
And then the voters are in a peer-to-peer fashion,
coming to consensus on what the best weights are.
And if the protocol detects a data-poisoning attack
or low-quality data, then the nodes can be slashed.
So there's an economic incentive and disincentive
baked directly into the protocol.
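That proposer/voter mechanism with slashing can be sketched roughly like this. It is a toy illustration under stated assumptions: the names, the median-vote consensus, and the stake numbers are all hypothetical, not the actual Flock protocol.

```python
import statistics

STAKE_REWARD = 10
SLASH_AMOUNT = 50
TOLERANCE = 0.5

def consensus_round(proposer, voters):
    """One toy round: a proposer claims a loss score for its model update,
    voters independently re-evaluate it, and the median vote is consensus."""
    votes = [vote(proposer) for vote in voters]
    consensus_loss = statistics.median(votes)
    # A claim far from consensus (e.g. a data-poisoning or low-quality
    # submission) gets the proposer's stake slashed; honest work is rewarded.
    if abs(proposer["claimed_loss"] - consensus_loss) > TOLERANCE:
        proposer["stake"] -= SLASH_AMOUNT
    else:
        proposer["stake"] += STAKE_REWARD

# Honest voters all measure the true loss of the submitted model update.
voters = [lambda p: p["true_loss"] for _ in range(5)]

honest = {"true_loss": 1.0, "claimed_loss": 1.0, "stake": 100}
attacker = {"true_loss": 5.0, "claimed_loss": 1.0, "stake": 100}  # poisoned model

consensus_round(honest, voters)
consensus_round(attacker, voters)
print(honest["stake"], attacker["stake"])  # 110 50
```

The poisoned submission claims a good loss, but the voters' median exposes the gap and the stake is slashed, which mirrors the economic incentive and disincentive described above.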
As for bias: there's definitely bias within every model.
I think it's impossible to remove entirely;
any system that's created by humans
will inherently have a degree of bias.
What's nice about doing this
in a decentralized distributed
fashion and an open fashion is that we have more visibility into the weights themselves. Again,
not the data, not the specific data points themselves, but the weights. And so we'd be
able to adjust the model and the way the parameters that are used to train the model
to sort of fine tune that in such a way to
remove bias. And again, because that is open and available to the public, that's more of a
community-based conversation as opposed to like, you know, a centralized executive team sort of
making a decision unilaterally. Does that answer the question? Yeah, that does. Thank you. I actually have a follow-up question right after Harry talks.
Oh, if you want to do your follow-up now,
I still remember your question, so I can jump in after.
Because, yeah, one of the main things that I have for you, Matt,
is: even though you allow people to
upload data as you're computing it,
how does it work in terms of regulatory compliance now? If you're having these
institutions, or let's just say America just decides one day, under the administration: "hey guys, we
don't want XYZ-type data or these types of AI companies here." Do you guys have a counter for that, or do you just run with it?
The liability there
would ultimately fall upon the participant.
As a protocol that lives on the Internet,
anyone can download the client,
no matter what your jurisdiction is.
From a regulatory standpoint,
it's very similar to Bitcoin mining:
some jurisdictions may not allow Bitcoin mining,
but some participants, some entities may decide that, hey, we're going
to take this regulatory risk.
We're going to risk the fine because we see the potential upside of mining Bitcoin as
a risk we have the appetite to take.
So we at the Flock level, and in particular at the protocol level, can't enforce who
can and cannot participate.
But that's not to say that, at the corporate entity level, we are just willy-nilly going into every jurisdiction.
You know, when we're working on partnerships and especially, you know, big strategic deals,
public facing partnerships, of course, we're very, very aware of and take a lot of
caution and due diligence around understanding the local jurisdiction, regulatory environment
and things like that. But again, specifically at the protocol level,
the protocol doesn't have geofencing or anything like that baked directly into it.
Nice. Thank you for that. And you're up next, Harry.
So, as I touched on earlier, we use GPUs and kind of AI at the edge for
a couple of reasons: one, for efficiency, but two, and most importantly, for privacy.
This means that the video streams that we are using to capture the information that we need
never leave the device. We don't see them; we don't save them.
And the model can be trained to capture the information it's trained to capture, and,
inversely, not capture information it's trained not to, or specifically told not
to capture, should that information not be, number one, wanted, or, number two, permitted. Now, this means that the output from these devices
is really meta-information. Now, there are hundreds of parameters for every, call it, vehicle, for
example: everything from make, model, color, speed, direction of travel, road, how close it is to the car in front of it.
Does it have visible damage?
What kind of classification of a vehicle it is?
Does it have any logos on the side?
These are all kinds of parameters that we would capture.
But they're then converted into text. Now, our solution is fully compliant to be
rolled out into any jurisdiction. Europe has regulations such as GDPR; California, which is where our first launch market was, has CCPA, which is kind
of California's flavor of it.
We anticipate there being further and further legislative changes in the landscape of
data protection and privacy, and having AI at the edge really makes it
the most future-proofed way to do things, because if you don't ever capture the data,
you never have the data, and therefore you can always fine-tune things to stop capturing it, should it no longer be permissible.
But as it stands today, everything we're capturing is 100 percent compliant.
And we're very, very much future-proofed for any changes that might occur, you know, in years to come.
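The edge pipeline Harry describes, running inference on-device and letting only permitted text metadata leave, can be sketched like this. It's purely illustrative: the detector, field names, and permitted list are invented for this sketch, not 375AI's actual stack.

```python
def process_frame(frame, detector, permitted_fields):
    """Run on-device inference, keep only permitted metadata, drop the frame."""
    records = []
    for detection in detector(frame):
        # Keep only the fields the current compliance policy permits.
        records.append({k: v for k, v in detection.items() if k in permitted_fields})
    # The raw frame is discarded here; only text metadata is returned,
    # so video never needs to leave (or persist on) the device.
    return records

# A stand-in detector: in reality this would be an on-device NVIDIA vision model.
def fake_detector(frame):
    return [{"vehicle_class": "semi_truck", "make": "Volvo", "color": "white",
             "speed_mph": 62.0, "direction": "NB", "license_plate": "ABC123"}]

# Compliance knob: fields not on this list are never captured, so a field
# (e.g. plates) can simply be dropped if regulation no longer permits it.
PERMITTED = {"vehicle_class", "make", "color", "speed_mph", "direction"}

print(process_frame(b"fake-jpeg-bytes", fake_detector, PERMITTED))
```

The future-proofing argument maps onto the `PERMITTED` set: if a field becomes impermissible, it is never extracted in the first place, so there is nothing to delete later.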
Yeah, that actually covers a lot of the steps.
Thank you for clarifying that.
So I do, because I know we have about 23 minutes left.
So I want to transition this over to like one more question
before we start getting into the Q&A.
So one of my questions that I have:
both of you guys' protocols do have a token, which has the obvious
incentive for the miners as well as the people providing the data.
So what are some lessons that you've learned from launching a token,
and what do you think the biggest challenges are for anyone
who wants to create their own AI project or protocol?
So yeah, I can take this, start with this.
So, I mean, inherently, yes, a token, especially a tradable token,
certainly is the incentive mechanism.
Now, it's not the only incentive. The sort of financial incentive,
being able to buy, sell, and trade it on the open market,
is not the only incentive.
So for example, with Flock,
our token is an ERC20 on the base network.
Now, in order to actually participate in AI training,
you convert it to a token that's called GMFlock, which stands for GameFlock, which is a token that
gives you the right to actually participate in the AI training, participate in the protocol
and get additional rewards in the form of tokens. As far as what we've learned and what we're learning about the ways that tokens can be used,
I'm starting to think about it more like this: in blockchain, we often talk about the different products that are built on top of blockchains.
But there's this idea of the token itself being the product, and everything around it being more of a feature of that product.
And so how I'm thinking about it
and how we're starting to think about
sort of more of our growth at Flock
is how can we actually give more utility
to the Flock token or to its counterpart GM Flock.
And so that might come in the form of
getting additional rewards from different projects
that we've partnered with.
But it also could come into play where it's like, if you have a certain
percentage or a certain number of GMFlock, then you get exclusive access to some events. Or,
for example, we just struck a deal with Alibaba Cloud for discounted Qwen tokens,
as in AI tokens, not crypto tokens. So having a certain number of GMFlock gives you the ability to get discounted access to compute.
And so it's thinking about how do we, again, I guess to summarize,
it's giving more utility to the token itself beyond just buy, selling, and trading it.
It's really sort of how we're starting to think about that.
And I mean, the only thing I would kind of add to that is,
you know, if you're thinking about starting a business in this space,
obviously, there's a lot of appeal to like start a business in a hyped up industry.
Often, by the time you've found out it's hyped up, it's maybe already too late.
You know, we've been building our platform for three years and three years ago chat gpt didn't exist and ai was not
a household name so um you know a lot has changed um over over that period of time um and
um you know i i think kind of forcing a square peg around a hole is not
necessarily always the right thing to do and then you know think about how you
really you know what why do you need your token it all sounds very cool to
have this tradable asset with potential liquidity but if you don't need it don't do it um we
we needed it because we have you know hundreds of thousands of participants in our network
uh already and we're not even you know we're still in testnet um so to settle and manage rewards to hundreds of thousands of people across the world is just impractical to do really any other way.
So it made sense, but there are many, many business models that can be hugely successful without necessarily having a token at the core of it.
um without necessarily having token a token at the core of it um now that's not to say that there
aren't ones that should have a token either but um just consider that when when looking at it
That would be my only personal piece of advice.
Awesome. Awesome. Look, we're going to quickly end on this: for each of you, what's one thing that makes you most excited about, let's say, the other's space?
So, Matt, for you guys on the DePIN side in particular, because you all have nodes and real-world devices that are coordinating, what makes you excited about DePIN's future? And then, Harry, for you guys, what makes you excited about decentralized AI, especially for the edge inference you have going? What are some innovations you're excited about?
Speaking personally, what excites me most about DePIN is the ability to earn by basically setting it and forgetting it.
For example, I've been a driver on the DIMO network for a long time. DIMO is a network that collects vehicle data. And then there's another project called NATIX, which is doing something similar, not the exact same thing: you put your mobile device up while you're driving, and it collects road data. What I love about that is it's very set-it-and-forget-it for me. I'm not earning enough to pay for my gas, but I would love a day where, just by connecting to these networks, opting in to share my data and collect data for them, it's paying for things in the real world.
And also, putting on more of a Flock hat, there is so much data being collected in DePIN. How can we use that data to keep refining small models that can run on the edge? Users would be rewarded not only for collecting that data, but also for contributing to high-quality AI models, earning tokens like FLOCK for contributing to those models as well. So it becomes a multiplier effect.
Love it. Look, Harry, do you agree with that? That there's a path for earning through DePIN as a DePIN builder? And then what excites you about decentralized AI?
I mean, yeah, absolutely. I've said before that my hope is that in the future, the word DePIN just fades away, because it becomes part of normal life, whether that's energy or connectivity. It will subsidize your life and make things either more affordable or give you additional passive income. And yes, it's happening today already: there are people making thousands of dollars every month by doing relatively little, and some making hundreds of thousands by doing a lot more than a little. So there are definitely large income streams.
As for decentralized AI: I came across this space seven years ago and was fascinated with it, so this feels full circle. I certainly think it's an architecture and a methodology for businesses to collaborate in a safe way, if that's the right way of describing it; you might have better terminology. Decentralized AI can democratize the space and enable collaboration, particularly with federated learning, between parties that otherwise wouldn't collaborate because of the sensitivity of the underlying data. There isn't any business risk in updating models to better each other, because no underlying data is being exposed. So I think platforms like Flock can create greater collaboration, and then all ships rise.
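The federated-learning property Harry describes, parties improving a shared model without ever exposing their private data, can be sketched minimally. The toy FedAvg loop below on synthetic linear-regression data is an illustration of the general technique, not Flock's actual implementation; all names and parameters are assumptions for the sketch.

```python
import numpy as np

# Toy federated averaging (FedAvg): each party trains locally on its
# private data and shares only model weights; raw data never leaves
# the party. Synthetic linear regression keeps the sketch tiny.

def local_update(w, X, y, lr=0.1, steps=20):
    """A few gradient-descent steps on one party's private data."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w = w - lr * grad
    return w

def fed_avg(parties, w, rounds=10):
    """Server averages locally updated weights each round."""
    for _ in range(rounds):
        updates = [local_update(w, X, y) for X, y in parties]
        w = np.mean(updates, axis=0)  # only weights are exchanged
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
parties = []
for _ in range(3):  # three parties, each with a private dataset
    X = rng.normal(size=(50, 2))
    parties.append((X, X @ true_w))

w = fed_avg(parties, np.zeros(2))
print(np.round(w, 2))  # recovers a value close to true_w
```

The point of the sketch is the data flow: only `w` crosses party boundaries, which is why the business risk Harry mentions disappears.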
Look, you heard it here first.
Go check out 375, you know.
But excited to have you both here.
So look, we'll open it up to Q&A here.
Go ahead and get your requests in.
I'll be approving requests, and Reese probably will as well.
We'll go ahead and approve some folks to come up and ask questions.
We're getting some requests, and I see one from Quiz.
Quiz, if you want to come up and ask a question.
Yeah. Welcome, Harry. One of the things that I've been paying
really close attention to is the rise of robotics.
We actually heard Elon talk about this not too long ago. Not that I'm some crazy Elon fan, but robotics is probably the highest-potential upside: the manufacturing of robotics, and furthermore the underlying component, the training of robotics, i.e. physical AI.
Harry, what do you think about the prospect of something like 375 Edge capturing
all this real-world data to show or help train robots such that they can operate in the real
world rather than in simulated environments? We've seen many instances where robots trained on simulated data, once they're put out in the real world, are, I'm not going to say producing catastrophic results, but maybe not performing to the level that they ought to. So that's my question: what impact might 375AI have with its real-world data in training robotics? We don't want another iRobot to happen.
Well, I mean, you only have to look at the number of failed rover missions on Mars to know that you don't know until you get there. It's a great point. Robotics and autonomous systems, be they vehicles, drones, and so on, are today pretty much exclusively reliant on their own eyes and ears. They take very little external information; maybe they take weather and things like that from the internet, but beyond that there's a limited amount they have beyond what they can see using their own sensors, which in most consumer products are very cheap and pretty basic.
So there's a huge opportunity for real-world sensors to augment the view of anything operating autonomously or robotically in the real world. Another business in the DePIN space, GEODNET, is providing positioning down to centimeter accuracy for robotics. There's another business in the space called Wingbits, which is doing air transportation information, so independent tracking of flights. Again, that's going to be useful for delivery drones: as the skies fill with new aircraft that aren't able to talk to ATC the way a small aircraft or a commercial airplane does today, the system breaks very, very quickly. So these things are absolutely necessary for robotics and autonomy to scale. They're going to need more eyes and ears beyond their own.
You know, just to double-click on that, you actually said something interesting. This is actually for, like, you guys. Is there a future, or could there be a future, where 375 and Flock team up and create something like a car sensor or a car plug-in, kind of like Progressive, that tracks user data in real time while still preserving user information, like who's driving? That would be something really interesting. I was just thinking about mass adoption, and that could even help extend that mass adoption as well.
Yeah, I mean... go ahead, Matt, sorry.
Oh, go ahead. No, no, come on.
Yeah, so certainly a future, especially as I could see a future in which Flock models are running on 375 devices, so running on the edge.
And those models are being trained and tuned as close to real time as we can get.
We also have a Bittensor subnet that we operate, which is focused on compressing very, very large data sets into smaller data sets that can be used to train small language models. So I can see a scenario in which there's a collaboration angle there.
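The subnet's goal Matt describes, shrinking a very large dataset into a much smaller one that can still train a model, can be sketched in a toy way. The approach below, plain k-means centroids as a crude coreset, is an illustrative assumption and nothing like the subnet's actual method; real dataset distillation is far more involved.

```python
import numpy as np

# Hypothetical sketch of dataset compression: replace N points with
# k cluster centroids, a crude "coreset" that preserves the rough
# shape of the data with a fraction of the rows.

def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None] - centers[None, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

rng = np.random.default_rng(1)
big = rng.normal(size=(10_000, 8))   # stand-in for a large dataset
small = kmeans(big, k=64)            # 10,000 rows -> 64 representatives
print(small.shape)
```

The compressed set is then what a downstream small-model training job would consume instead of the full corpus.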
Oh yeah, because I would totally put it in my wife's car
just to track like how much reward she got, babe.
Thank you so much for sharing that, mate.
So are there any other questions from the audience?
If so, raise your hand or request to speak
and we'll toss you up on the stage.
I might jump in here. My name is Reece, current research intern at Vested.
Amazing conversation so far today, gentlemen, this is great.
Just curious: as entrepreneurs, developers, and founders of these projects, paving the way for personal control over models and data, what are your predictions for the future of AI, decentralized or centralized? Do you think the ease of access to centralized models is going to keep capturing market share among non-crypto natives?
Where do you see this going?
Matt might have a better understanding of it, but I mean, every business is using bits of different models. And the fact that we as a small business can get access to something so powerful and then tune the dials is just incredible. We're doing things that just a matter of months or years ago were impossible, and now we're doing them in our sleep. So the fact that this is available means there are businesses like ours that can foster that innovation and build incredible things that would otherwise be impossible. I'm a huge fan of it.
But you can't just stick a white label on the front of a can of white paint and say it's something new. You have to do something to transform it and make it interesting, otherwise you will be irrelevant very, very quickly. So it's what people then do with the ingredients they're given. We've now been given all these great pieces of produce, and we're the chefs. And if you're a great chef, you'll make a great meal.
Yeah, awesome. Awesome. I'm curious what Matt has to say about this.
Yeah, I think when I first came into crypto, and then decentralized AI years later, it was definitely with more of a cypherpunk sensibility, and there's still a lot of that in me. But in reality, here's a rhetorical question for everyone here: what decentralized AI are you using today? My guess is that most people are defaulting to centralized AI solutions, and the reason, quite frankly, is that in the open market they're just better. So I see it becoming more of a convergence point: what are the strengths that decentralized AI offers relative to centralized AI?
And I truly do think that if people can have feature parity between, say, two chat apps, one that you pay for and one that you don't pay anything for, or even better, that you earn from by contributing to it, that's a scenario in which decentralized AI becomes more compelling. Also, centralized AI companies are struggling with the data problem. You sort of see this with the recent acquisition, I forget the name of the Scale AI CEO, but Mark Zuckerberg bringing him into Meta for like $100 million. They're a data labeling company, so they're trying to solve the data problem. And what decentralized AI does is incentivize that data contribution, something centralized AI companies can tap into if they partner with decentralized AI projects.
I don't think it's necessarily going to be an either or.
I don't think we can draw that distinctive of a line
between centralized AI versus decentralized AI.
I think a lot of it is context dependent, really thinking critically, building in a very, very mindful way, and asking ourselves where decentralized AI fits. In the long run, it's just better AI. So that's how I think about it. Does that answer your question?
Yeah, yeah, 100%. Thank you so much. I appreciate you guys.
That's great. That's great.
We'll take one or two more questions if anyone has one.
If not, we can end on a final question. Let's see. While we wait, I'll ask one in the interim: what are some challenges, and good things, you all have experienced when trying to explain your product to Web2 users or Web2 customers? So for 375, trying to explain why someone should use a DePIN tool to get this data versus traditional options. And for Flock, when you're trying to bring on AI researchers who are used to following the research of hyperscalers, versus getting them to participate in a platform like Flock.
What are the opportunities and challenges you've seen in explaining
why a Web3 option is useful to Web2 people?
Yeah, so explaining anything technical depends a lot on the audience; of course, there are technical Web2 people, so my explanation is very much tailored to who I'm talking to. But I think about someone who knows what ChatGPT is but barely even knows what a model is. They just know they're getting the outcomes they want from interacting with it. The challenge in explaining is deciding to what degree to go deep.
But at the very basic level, how I like to approach it is: imagine having an AI that knows everything about you and all of your preferences, one that follows you and that you can use throughout the rest of your life.
All the data that's being used to train that AI stays private.
So none of that data will ever be exposed.
Therefore, you're not risking losing privacy.
And oh, by the way, you can also earn rewards and get paid to contribute to it.
That's often what I land on as a starting point.
And that's another perfect use case for ZK proofs.
Yeah, but the second you mention ZK proofs to, call it, a normie, all of a sudden that's where the challenge is: okay, to what extent do we get technical? Because if I say ZK, zero-knowledge proofs, it's like, oh boy, now I've got to explain this concept. So it really just depends on the curiosity of the person I'm describing it to.
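For the curious, the zero-knowledge idea being referenced, proving you know a secret without revealing it, can be illustrated with a toy Schnorr proof made non-interactive via the Fiat-Shamir heuristic. The tiny parameters below are purely illustrative and nowhere near secure.

```python
import hashlib
import secrets

# Toy Schnorr zero-knowledge proof of knowledge: prove you know x
# with y = g^x mod p, without revealing x. Tiny demo parameters:
# g = 2 generates a subgroup of prime order q = 11 modulo p = 23.
p, q, g = 23, 11, 2

def prove(x):
    r = secrets.randbelow(q)              # one-time random nonce
    t = pow(g, r, p)                      # commitment
    # Fiat-Shamir: derive the challenge by hashing the commitment
    c = int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % q
    s = (r + c * x) % q                   # response: x is masked by r
    return t, s

def verify(y, t, s):
    c = int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % q
    # check g^s == t * y^c (mod p), which holds iff s = r + c*x
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = 7                  # prover's secret
y = pow(g, x, p)       # public value everyone can see
t, s = prove(x)
print(verify(y, t, s))
```

The verifier learns that the prover knows `x`, but the transcript `(t, s)` reveals nothing about `x` itself, which is the property being gestured at in the conversation.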
So we did get a question from Dash.
Dash, you want to hop on real quick?
Wait, wait, Dash, quickly.
Harry, did you have any thoughts on that question or we'll go in with Dash?
I got lost in what the question was.
So let's go to see what Dash has got to say.
Hi, guys. This question is for Flock. Who's your customer base? I mean, who is Flock going to serve? And then a technical one on privacy and control: do you provide the service for general public privacy, or for specific apps? I just went through the documentation, and it's not very clear to me. That's why I asked the question.
Yes. The question is, who are our customers?
Correct.
Yeah. I would say the end participants are our customers.
These are the machine learning engineers
and the validators that are participating on the network,
that are fine-tuning and training the models,
but getting rewarded for doing so.
We're serving that customer base,
but we also have customers that we work with
where we build custom AI models.
So, for example, we built a private model for a venture capital firm. They basically get overwhelmed with pitch decks and product information they're trying to assess.
So we built a custom model for them.
So that was trained on our AI training platform.
And we did further fine tuning to build a full stack application for them.
And then we also work with partners as well.
So we're not selling into them directly.
Instead, we're partnering with them to go to market.
So right now, we have an active partnership with a project called Beacon Protocol, a protocol that's basically aggregating a bunch of data and doing so in a private way.
We're actually creating a model from DIMO user data that is being ingested into Beacon Protocol.
Then we're using that to create a model
that's basically going to help with
fuel efficiency and analytics for drivers.
So I guess in summary: yes, the customer base is the users, the participants who are training the models; then corporates, who we develop custom models for; and then more of the partner side.
Harry, did you have anything to add?
No, nothing, other than I think what Beacon is doing is pretty cool. I've also been keeping an eye on them and am excited to see where that goes.
I think certainly finding a way to monetize, aggregate, and fuse the data being captured by these distinct DePIN sensor projects makes the data more powerful. It's one plus one equals three, right? If you can combine what we were talking about, NATIX, DIMO, Hivemapper, 375AI, those pieces of data are all individually valuable; combined, they could be ten times more valuable. So having a way for those to be aggregated and disseminated makes total sense, and doing it in such a manner that you're not necessarily exposing the underlying data to each party. So Flock plus Beacon Protocol plus data projects such as 375AI, DIMO, Hivemapper, NATIX, GEODNET, Wingbits, you name it: there's something very, very groundbreaking and industry-shifting there.
Love it. Love it. And Jude, we'll end with you.
Yeah, GM, GM, guys. Good to be here. Thank you, Josh, for having me. Very powerful conversation today; this topic is very, very interesting, right? Just one quick question. I want to understand from the guest speakers:
currently, we already have centralized AI like ChatGPT, Claude, and all of that, right?
Now, we are witnessing the rise of decentralized AI.
So I wanna get from you guys.
So what does the future of AI hold?
Do you guys see, in the next five to ten years, a decentralized AI that is going to overtake ChatGPT? Because at the moment ChatGPT is centralized, right? And you see what they are doing with people's data, right? I think that's one of the disadvantages, and maybe one of the differences, between centralized and decentralized AI. So do you guys think, from the blockchain perspective, we're going to have a decentralized AI that will overtake centralized AI like ChatGPT and others?
Yeah, so thank you so much.
My frank answer is no. I don't think, especially for retail application users, with something like ChatGPT, that there is going to be a decentralized application with more market share than the leading centralized AI.
That being said, I do think that as data that has not already been used to train models becomes more and more scarce, there will be more decentralized AI projects and models that are either standing alone and capturing a smaller segment of that market share, and I do believe that segment will grow, or are contributing to the development of centralized AI.
But to answer your question, just very frankly,
I think most users don't care
whether something's centralized or decentralized.
And in reality, most decentralized projects, and I find this especially working across the ecosystem, have super, super high coordination costs, and it becomes hard to interface with jurisdictions or regulators in the way a centralized company with resources can. But I do think that decentralized AI will underpin and
really catalyze the growth of AI generally and even feed into centralized AI.
Love it. Love it. Harry, look, we'll let you have the last word if you have anything.
I am, as a rule, never going to predict what's going to happen in AI in the next five years, because I don't think anyone can, and anything I say will almost certainly be wrong.
There you go. Okay. Mic drop.
Well, awesome. So, guys, listen, this was a fantastic conversation covering, like I said at the beginning, two incredible projects in this space. Go check out their docs, et cetera.
Let's give a quick outro for both of you on where folks can follow and keep up.
Here, we'll start with you.
So, yeah, 375 on X is 375AI underscore.
I'm on X at Harry underscore Dewhurst.
You can check out our testnet and download our mobile app at testnet.375.ai.
We've got a Discord; check that out on our website, 375.ai.
And yeah, it's been a pleasure being here.
Pleasure to talk to you and good to chat.
Yeah, so on X, you can follow us at flock underscore IO.
And then you can check out more on our website, flock.io.
And if you want to participate
whether you're a machine learning engineer,
a validator, or even someone
that does not have a technical
background, you can participate as a delegator.
And that's at train.flock.io.
Love it. Love it. Okay, guys, this has been awesome. Appreciate you all.