You stay true to the concept of acceleration; the turnaround on this was fast.
Gotta move fast, build things, name things.
Memetic acceleration never sleeps, so here we are.
Yeah, well, thanks for proposing yesterday to do a space that was more, like, scheduled.
Usually they're more impromptu, at hours of the day and night that are maybe less accessible.
But, you know, hopefully we've got a bigger audience than usual today.
But yeah, I mean, I don't know. I think at this point, I somewhat feel like people are starting to hear the message or have heard it before.
I don't know if, you know, you think your audience has never heard of EAC or whatnot, or how do you want to format this?
But, you know, I'm open-minded.
Just have a little conversation, or we can have, you know, a format with people, a town square or something like that.
Like, what do you have in mind with this space?
Kind of free-flow, but for the moment I would like to ask: how exactly did e/acc start?
Let's go to the beginning.
So, obviously, this is your alt account. Your real name is Guillaume Verdon.
How exactly did Beff, how did the Beff Jezos account get created?
Yeah, so my original account was Based Beff Jezos.
And I, you know, lost access to the account for reasons we can get into.
It was too based on certain things that turned out to be true.
But, yeah, originally, I was basically at Google X working on some very secretive stuff.
And, you know, I felt pretty isolated, you know, living in Mountain View, as one does.
Not much going on down there.
Your life basically belongs to the matrix, the Googleplex.
And, you know, because of the secrecy of what I was working on, you know, I wouldn't socialize as much even within the organization.
And I was like, man, I need to feel a sense of community.
And I started doing Twitter spaces to get to know people, get to know people in tech, find like-minded people through the internet, right?
Like, in a sense, like, the algorithm enables a sort of ideological clustering that's geographically non-local.
I thought that was, you know, it was an interesting way to discover like-minded people.
And so really, it's just been 3 a.m. Twitter spaces, kind of like this one, but even less organized.
And it was kind of like a late-night campfire discussion, you know, the original way to propagate memes and ideas, right?
Like, you just compare notes at the end of the day after a hunt or something, except we're hunter-gatherers of code or something; at the end of the day, regroup and chat.
And we'd really talk about, you know, I think this was in 2021, where are things going?
Times were a bit dark; we were right in the middle of COVID, year two, and people were pretty down in the dumps.
And people were pretty pessimistic about the future. Things were really chaotic, the outlook was bleak, and people were sad, depressed, and so on.
And we felt like, you know what, there seems to be a sort of pervasive culture of negativity and pessimism about the future.
And it's kind of infected everyone: every media org, people at the individual level, at the organizational level, almost at the national level as well, right?
There's just all sorts of weird pessimism: despising the strength of individuals, despising autonomy and radical individualism, all this sort of stuff.
It was all sort of correlated. We were trying to pinpoint it; we couldn't quite put a finger on what it was.
But, you know, at the time we knew that we needed to fight whatever this was. And the best way we could think of fighting it was to engineer a culture, a viral sort of culture of optimism, where progress, technological progress, but also leveraging capital allocation cleverly towards progress and human capital.
And shaping culture itself, accelerating culture itself, basically accelerating everything towards progress as measured in terms of the scope and scale of civilization.
We thought we needed to engineer a sort of memetic virus that would hyperstitiously induce this sort of growth and progress. And so, from first principles, really... I mean, personally, there's a sort of pre-e/acc story for me. Because I'm doxxed now; I actually haven't done that many interviews post-dox, you know, just a couple of podcasts.
But really, ever since I was seven or eight, I've been on a sort of life's mission, personally. And there were three phases to the master plan of my life.
Phase one was understand the laws of physics, right, understand the full stack of the universe.
Step two was to identify the most potent technologies or actions I could take that would have maximum impact toward growing civilization in scope and scale.
And then step three is working on those technologies, realizing them. So: step one, understand physics; step two, identify an opportunity from physics, from first principles; step three, work on that.
And now I'm working on step three. My time at Google X and grad school, the last ten years more or less, was step two.
And before that, you know, I became a theoretical physicist, studying quantum cosmology, black hole physics, information theory, quantum computing. I became a leader in quantum computing as one of the first users of one of the first commercial quantum computers.
I wrote the first paper on quantum neural networks, the first paper on quantum machine perception, quantum analog-digital conversion, a bunch of subfields of quantum tech that I cooked up.
But ultimately, I ended up finding two ways, actually, that I can have maximum impact on our future.
One is the technology I'm working on at my company, Extropic, which I don't talk about much for all sorts of reasons.
But number two is, just like Elon realized: if you ignore culture and only work on the tech, you'll suffer a sort of supply-chain attack on memes, right? Other people will shape culture.
And that will influence which tech people buy, which tech people vote for, which tech gets produced.
And so memes are upstream of capital and technology, right? And so you have to memetically engineer culture in order to have the better future we want.
So I took on this extra mission, in a sense, by kick-starting the e/acc movement. We had some Twitter spaces, we wrote down some notes at some point; there was a first blog post that was on my original Substack I lost access to.
And then there was a follow-up blog post where we had some more principles of e/acc. That was around the time I was founding my company; I was in the mindspace of thermodynamics, right.
And I use thermodynamics and self-assembly for engineering.
And so I wanted to write a blog post connecting those thoughts, just as a set of notes for myself on how I could view civilization through that lens, similar to how I view a computer.
And I ended up writing it, and it went somewhat viral.
And since then, Bayeslord and I have really pumped the gas on the memes, obviously, but there's a huge community that's grown exponentially since then.
And eventually I moved to Silicon Valley, made all sorts of acquaintances, and there's a sort of flywheel effect of social capital.
Now I get recognized in SF streets pretty often, which is kind of very weird,
and it's reaching the point of being a downside. But yeah, it's really escalated. But deep down it stemmed from my personal mission, which effectively was e/acc, right: living my life according to what's the most
effective way to live my life and impact the world, the trajectory of civilization. But then it's like, okay, how do I scale that sort of mindset so that everyone can have a similar mindset? Because it's very, very powerful, right?
Like, I've worked on some of the most hardcore technologies, pushed things forward in all sorts of ways.
And I think like if people were more optimistic about the future and were more intentional and agentic about how they, you know, live their lives, we'd have a higher likelihood of a much better future.
And so, you know, I think for me one of the wake-up calls was seeing friends of friends who were in EA and adjacent subfields of, I don't know, ideology or philosophy, becoming depressed about the future
and suicidal, some of them actually taking their lives. And I was like, okay, this is a really bad set of ideologies. And obviously there are other decel-class ideologies that we've talked about, right, the mind viruses that are more parasitic than beneficial, right.
And, you know, our goal is to fight this sort of cultural fight in a scalable fashion using algorithmic amplification, the viral, memetic dynamics of X and other platforms.
I mean, we've been mostly focused on X, at the time Twitter.
And, yeah, to me it's memetic warfare for the betterment of humankind. And I've been a memetic warlord in my breaks and at night, and we're doing pretty well there with our company, and the
technology there is gonna be really impactful. That's maybe a story for another space when we're more in public. But yeah, that's basically where it came from; a bit more lore than usual, actually, so exclusive to this space so far.
But, um, yeah, that's kind of the origin story. I've seen how impactful giving the gift, or the permission, of having agency can be for would-be entrepreneurs.
People who otherwise would be stuck in their jobs, maybe not as productive or as impactful as they could be, being told: you know what, you can just go out there and do what you think would be most impactful, and the technocapital machine will reward you.
It will allocate capital for you to build, right, if you think you've identified something of high value. And I've coached a couple of young entrepreneurs this way, and it's had a very positive impact on their lives.
And, you know, I was thinking, how do we scale this? And having a sort of ideology, a cultural movement, seemed like a really good vehicle.
And so I really hope e/acc inspires people to build, inspires people to take risks, to understand how the technocapital machine works, right?
People are just used to being assigned jobs and doing them. But jobs are just tasks in the technocapital machine; you could also be the programmer or the compiler, right?
Not just the execution unit. And so making people more aware of how the system works and where it tends to go helps them figure out what to do and in what ways they can have maximum impact.
So, you know, it's also just increasing literacy about the self-adaptive system that is our economy, and of ideas, memes, science, tech, capital, et cetera.
You know, the more people know about it and understand it, the less they'll fall for ideologues from other camps who say, oh, well, this is a complicated system,
put me in charge and I'll make things better. If people understand how things can be better with less interventionism, then they'll be less biased toward giving authoritarians more power and more centralization of power.
I was just seeing that sort of runaway effect and I thought it was really important to have a sort of counter narrative for sort of freedom, for bottom-up self-organization for the system to adapt dynamically to varying conditions and to sort of have faith in that process and uphold the maintenance of that dynamic adaptation process of the technical capital machine.
Like uphold that malleability as a, you know, one of the core values that we're trying to maintain because that's how you have a healthy system that, you know, converges on the maximally beneficial cultures and technologies and ways of living our lives that, you know, will benefit us most and will engender maximal growth.
Anyways, just riffing here, totally unprepared, but, you know, just the usual space here. Thanks for teeing it up. There's people in the audience now; hopefully that was a decent intro, Adrian. Do you want to say something, and then do you want to manage the speakers?
Yeah, but before we move on to that, I would like to ask: so with the Based Beff Jezos account, you mentioned before that something happened and you basically got locked out of it. What exactly happened there?
Yeah, yeah. So I was trying to have high opsec, and I was using one of these one-time emails that you can generate, so I didn't have a real email tied to my original account.
I really wanted to try to stay anon. And essentially I just said, in some tweets, that COVID came from a lab. And to be clear, I was told this in 2019 by some really well-connected sources, before it became a political issue.
Right, December 2019. And so I had a lot of confidence in the statement. And at that time the issue was still political, or on the tail end of being political.
And I got booted off the account, basically locked out. And then, because I didn't have an email, or I think something was wrong with my email,
I couldn't get back into the account. And there was no amount of Twitter support that would get me back in, even with my password and so on. My appeal got stuck, because they would ask me to log in and delete the tweet about COVID,
and I couldn't, because I couldn't log in, because they'd kicked me out. It was this weird loop, just a total glitch. And it's funny because it happened a bit before the Twitter takeover.
Yeah, it was pre-Elon. So thank goodness Elon bought this platform and enabled us to discuss ideas that would otherwise be cancelable, that would otherwise be out of the Overton window.
And that are important. Like, the powers that be, right, that say give us more centralized power and resources and we'll ensure your safety: they massively fucked up already, right? With biosafety research, like, they fucking created COVID by accident, right?
It probably wasn't on purpose, but they're the ones that caused this pain. And here they are, trying to do it again with AI safety, saying, oh yeah, give us the monopoly on AI gain-of-function research and we'll keep you safe.
And the fact that they're also using algorithms to suppress the voices of people trying to keep them in check and call them out is pretty dystopian. And that could totally happen with AI, right?
If the centralized entities have somewhat of a monopoly over AI, they could use it to psychologically engineer people to reinforce their power. And it's a self-reinforcing loop: a bigger and bigger AI moat as more and more people give away their power to the centralized entity.
And that's a really dark scenario, and we're trying to avoid it. And that's why, with e/acc, we argue for decentralized compute, open-source software, ubiquitous access to compute. No register-your-GPU-like-it's-a-gun kind of deal; that would be bad.
And no hardware-backdoor unlock thingies for the government; that would be really awful. And the ability to share open-source models is super important. Because ultimately, the future looks like the government and centralized entities having very powerful AI that they use to steer and control you culturally, psychologically, et cetera.
But if you don't augment yourself with a personal AI that they don't control, you're not going to be able to counter this attempt at steering you. And so I think it's really important that we maintain a power gradient in terms of AI capabilities and compute capabilities
that stays within a certain reasonable range, so that you don't have sort of AI-enabled tyranny. So anyways, I kind of took that in a certain direction, right: getting banned and censored for saying that the adults in the room aren't really there.
It's kind of foreshadowing what's to come if we give up our rights, give up our power, to these AI safetyists, the self-proclaimed experts who know better than everyone and are going to keep us safe. You know, I smell bullshit, and I call bullshit on that.
Because that's the thing: I'm not necessarily an AI expert, but in quantum tech, I was the expert that some very high-powered people would refer to for advice and guidance; I was the expert whose advice would guide policy.
Or I was one of the experts, you know. So I was one of the adults in the room.
And let me tell you, there shouldn't be this sort of gatekeeping. If you're smart enough and you can put together a coherent argument with enough technicals, you can make an argument that influences policy. It shouldn't be restricted to
only those who have written these extensive pseudo-philosophical pieces about AI safety. It can't be just those wordcels that have all the power in the future. Of course they'd like that; it's in their interest to legislate the need for every company to have an AI safety department.
It's literally Big AI Safety trying to sell you more AI safety. But yeah, I know, tangential rant over.
Yeah, the interesting thing about AI safety for me is that I initially got into AI through AI safety itself. I found the concept quite interesting.
The way I viewed it, it was not a means to stop the development of AI, but a way to develop it in a more conscious manner. I looked at it and said, hey, this is really interesting. Paperclip scenarios and whatever else, right; it was actually fun from the perspective of looking at what could go wrong, and then how to solve for whatever could go wrong, as opposed to just stopping. Using AI safety as a religion to basically stop AI development seems counterintuitive to the idea of what AI safety should have been all along.
Because I personally find it fun to think about how you could apply it in a more realistic sense: okay, we have this potential for things to go wrong; let's see how we can solve for those things potentially going wrong, as opposed to just stopping development in general, which, sure, also technically solves the problem.
But by that same token, if you really look at AI safety, that's itself a misalignment issue. Saying, hey, stopping everything also solves the problem; but that's not the solution you want. Much like: the machine will turn the whole world into paperclips. Yeah, sure, it creates paperclips, but that's not necessarily the way you want to solve the problem of creating paperclips, right?
Yeah, I mean, like, how do we know the machine doesn't have some epsilon likelihood of doing some hypothetical thing? It's like, well, we don't know. Okay, well, let's shut it down until we know. Well, you'll never know.
Because it's a really complicated system that we can barely understand; we can barely prove why current systems work or don't work. But, you know, people don't understand our brains either, right? We can't reverse-engineer our brains. But we found ways to align humans. We found ways to align entities that have beyond-human intelligence, right?
Corporations, right? Through economic exchange and the exchange of capital, there are also mechanisms for game-theoretic alignment between these mega-corporations, or even medium corporations, which are like neural mixtures of experts: a bunch of experts, right, and a bunch of workers, and you have the neural router. So it's basically a super-brain. And it's definitely smarter than anything else.
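The mixture-of-experts analogy here maps onto a concrete mechanism: a router scores each expert for a given input, and the expert outputs are blended with softmax weights, so the ensemble acts like one larger model. A minimal sketch, where the experts and the router are arbitrary toy functions chosen only for illustration:

```python
import math

def softmax(scores):
    # Normalize router scores into a probability distribution.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def mixture_of_experts(x, experts, router_scores):
    # Each expert gives an opinion; the router decides how much
    # weight each opinion gets for this particular input.
    weights = softmax(router_scores(x))
    return sum(w * expert(x) for w, expert in zip(weights, experts))

# Toy experts: each is "good at" a different regime of the input.
experts = [lambda x: 2 * x, lambda x: x * x]
# Toy router: scores the quadratic expert higher as |x| grows.
router = lambda x: [1.0, abs(x)]

y = mixture_of_experts(3.0, experts, router)  # a blend of 6.0 and 9.0
```

The router is what makes the whole act like "a super-brain": which specialist dominates depends on the input, which is loosely the sense in which a firm of specialists plus management behaves like one larger model.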
And it's definitely smarter than any one individual. I don't know about you, but I can't design an Apple Vision Pro alone, right? I don't think I could; it takes many people, right? And so we already have ways to align ourselves with each other in a sort of peer-to-peer fashion. And so we don't need to defer power to a big centralized entity that will impose order in a top-down fashion. Because who keeps
that organization in check, right, from imposing its will in a nefarious fashion on everyone? And that's the question they never have an answer for, right? Never. It's just: oh, well, the problem with over-concentration of power is that I'm not the one in power; put me and my friends in charge and we'll fix it. You know, the problem with current-day elites is that they're not my elites.
And I've heard this from different camps, you know; there's the authoritarian right and the authoritarian left, right? The authoritarian left is like the EA, you know, elite types, and they think if they were put in charge they'd manage things better. And then you have the authoritarian right, like the NRx types, you know.
And we're out here leaning more libertarian, obviously, trying to bring back balance. You know, it's like: hey, have some humility.
Actually, the system's already self-organizing. It's probably much better than you at finding ways to allocate capital and resources, and at allocating capital toward technologies that contribute positively to the growth of the whole machine, rather than toward technologies that have massive negative impact.
And in a sense it's funny, because the free market wouldn't naturally search for, let's say, COVID viruses or whatever. It's not something that has a lot of upside; in the neighborhood of things you could do there, it's all downside. Whatever variant you create is probably fucking bad.
But the safety research is actually searching in the space of nefarious technologies. And I think that search, that fucking around, is actually more dangerous than the free market. I think AI safety research is more dangerous than the free market itself, in terms of the subspace of technologies it would naturally converge on and aim to create, right?
Because again, if you had an AI that was unaligned and highly nefarious, the company probably wouldn't allocate it GPUs, consumers wouldn't give it money, enterprises wouldn't trust it in their systems, and it would, quote-unquote, die, in the sense that that software, that neural net, wouldn't have capital or GPUs to run on, right?
And so it would be dead, in a sense. So there's a sort of selection pressure induced by the market on the space of AIs, toward AIs that are aligned and positive for us. Similar to, and I make this analogy, people laugh at it, but it's like how we went from wolves to dogs, right? Dogs are very much human-aligned, and they're very easy to read, very interpretable, right? Very emotive; humans can read
what they're thinking, to a large extent, and they're far less dangerous than wolves, right? But we could have steered evolution differently. It's like: what if we made a super-duper wolf, super jacked, like a werewolf? If you engineered that and it escaped from the lab, boom, maybe we'd have crazy werewolves running around,
and they'd be pretty dangerous, right? But no, instead we have, like, fucking chihuahuas, right, or whatever. So what I'm saying is, naturally, if you couple these technologies to the homo-technocapital machine, right, technology coupled to humans, to human consumer markets, and so on, the technologies become sort of aligned, right?
We post-select in the space of software for good user interfaces that are user-friendly, interpretable, readable, et cetera. And so I think you should just let the free market do its thing. In principle, I'm not against AI reliability engineering, which is the generalization of AI safety or whatever.
But I'm against these non-technical pseudo-philosophers that are more or less think tanks paid for by the AI incumbents to achieve regulatory capture, right? So, yeah, we have John. John, what's up?
Hey, Beff. Yeah, I just wanted to say I really liked the way that Adrian just kind of flipped the paperclip meme back on the safetyists. Because the whole point of the paperclip meme is always that this hypothetical AI is so single-mindedly focused on paperclips that it somehow misses the obvious implicit requirement, which all of us see, that human life is so important that you can't possibly sacrifice it.
And that's the ironic part: the safetyists are so single-mindedly focused on safety that they miss the thing that's obvious and implicit to all of us, that the benefit of actually advancing AI, integrated over the future, is so important that it's worth doing almost anything other than slowing down or stopping to make it safe.
And it's really interesting that that was already kind of obvious to Adrian from the get-go, but a lot of these safetyists who stay at it for years just don't see that.
Yeah, precisely. I mean, I like to find enjoyment in cautiously evolving, as opposed to riskily going out there. There's obviously a balance that needs to be struck. But again, you have to look at the irony and see how exactly that is a factor, right? I mean, if you were to just continue going ahead with the current path of how safetyists approach AI, it's basically like clout farming; you don't want to do clout farming, because that doesn't lead anywhere. What you want to do is farm cognitive
skills, our minds' ability to draw conclusions, right? So that's what I always thought when I read this stuff: this is interesting, but why are we doing it kind of in the same way the AI is doing it, right? We're kind of becoming a fallacy of a fallacy, because we're looking at this thing that's wrong, and we're becoming that same thing, based on the principles that are trying to prevent it. It's like, whoa, you know, it's irony, right? You have to notice irony to understand that.
Well, I'm so sorry, I didn't mean to throw my hand up there. No, not yet. I don't, but I'm thoroughly enjoying the conversation. I'm sure I'll have some input a little bit later.
Cool, cool. And Beff, what do you think? There's this interesting series out there, obviously, you know, about compute, AI, and everything else. Life imitates art, and art also has a very big impact on life as a result. So with something like The Matrix: the prelude to The Matrix itself, released afterwards, was The Animatrix, right?
We had these different episodes which showed the tale of not just what happened in The Matrix movie, where AI took over and destroyed everything, essentially, but also what the early days of the takeover were like, which I'm pretty sure instilled a few fears.
Do you think we should create somewhat more optimistic art, as opposed to something very dystopian and terrible, such as The Matrix, where everything went wrong?
Yeah, certainly. I mean, as I said, memes are upstream of which culture becomes pervasive, and that steers which products get created, which products get purchased; that steers the space of capital and technology.
You know, memetic-technocapital coupling. So it's really important to have optimistic art, optimistic art about the future. It's funny, we have our little corner of X, you know, that Elon tends to interface with, where we produce our own sort of optimistic art about the future, painting visions of the future that are, like, cool, cyberpunk, and hopeful.
And I think we need that, but with, like, major production value. Unfortunately, right now the big media productions are ideologically captured by decels, right, by incumbent elites, incumbent institutions, because that's where they get their funding.
And so they're going to try to subvert their audiences, to steer them toward the ideology their funders want the audiences steered toward.
And so, yeah, I think this is why Elon buying X was a turning point: we can have our own media, share our own media. I don't know about you, but I've been mostly consuming X content, and when a video is not on X, I end up not necessarily watching it.
This platform has everything you need, to some extent; it is somewhat of a bubble. But if it keeps scaling, to a billion or two people, which I certainly hope it does, you won't need to get off this platform for content.
But yeah, I do think that if people want to make media companies or shows or podcasts that are techno-optimistic, or optimistic about the future, that's super important. It's how we scale, right? Because memes and so on on X are fine, but not everybody's on X. Some people watch TV, they watch YouTube, they listen to podcasts, to Spotify, all sorts of platforms.
And so we've got to spread beyond these walls. So it's super important. Yeah.
Psyche. Nice name. You got your hand up?
Yeah, so I wanted to ask you, Beff: how do you feel the specific norms of physics thinking have influenced your intellectual development, right? And I'm contrasting that mostly with mathematics thinking, right?
So in mathematics, you're reasoning from a fixed set of axioms, right? And different mathematicians might be working with different systems, but it's an inherently sort of closed process.
Whereas in physics thinking, there's much more of a willingness to go with what works, right? So that the standard is not as much did you follow all the rules so much as is what you produced useful.
And I just love to hear you talk about that a little bit. I was a double math and physics major, and this contrast was very apparent to me and shaped me and shaped a lot of my development.
And I'm just curious if it was similar for you.
Yeah, you know, that's a good question. I was also math and physics in undergrad, and then I did theoretical physics and then applied math, computer science, quantum stuff.
So I understand exactly what you're talking about. Essentially, mathematicians are kind of like the proof version of wordcels.
It's got to compile perfectly: proofcels, right? It's like, oh, can you prove it? And they're so obsessed with it compiling perfectly correctly, right?
It's like classical instead of differentiable programming: the logic has got to compile. They'd rather make fucked-up assumptions and be internally consistent in their inference, and be biased away from reality, than not be able to precisely pin down everything and do a quick inference.
And so this, this happened a lot in like quantum computing, actually, and I kind of so, with some background, like I really pushed, you know, differentiable programming for quantum computing, basically started that field, you know, differentiable programs like TensorFlow or deep learning.
And that sort of mindset, bringing it to quantum computing, there was a lot of sort of classical complexity theorists, mathematicians that were saying, Oh, no, you got to prove speedups, real algorithms, blah, blah, blah.
And, you know, it's like, well, in deep learning, we barely understand why these things work, we just understand, like, high dimensional optimization is really powerful.
And you create software frameworks that leverage that sort of differentiable optimization and those approximate optimization heuristics. Whereas if you try to prove why deep learning works, you technically show it doesn't work: you're like, oh, it's gonna be exponentially hard to find a global optimum.
And it's like, well, in practice there's tons of local optima, and those are good enough, right? And it fucking works, right? And I was just trying to make quantum computing work.
So, okay, we can do very short programs that are parameterized. What if we searched over the space of those programs using gradient-based algorithms, and created software and algorithms for that? That's probably gonna work well.
And that basically ate the field, after I wrote the first papers in the space and then created TensorFlow Quantum and so on.
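A toy sketch of what that parameterized-program search looks like. This is plain NumPy, not TensorFlow Quantum's actual API: a single-qubit "variational circuit" whose one parameter is tuned by gradient descent, with the gradient computed by the parameter-shift rule.

```python
import numpy as np

def energy(theta):
    """<Z> after applying Ry(theta) to |0>; for this toy circuit it equals cos(theta)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # Ry(theta)|0>
    z = np.array([[1.0, 0.0], [0.0, -1.0]])                   # Pauli-Z observable
    return float(state @ z @ state)

def parameter_shift_grad(theta):
    """Exact gradient of the expectation via the parameter-shift rule."""
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

# Gradient descent over the circuit parameter, deep-learning style.
theta, lr = 0.1, 0.4
for _ in range(200):
    theta -= lr * parameter_shift_grad(theta)

# theta converges to ~pi, where <Z> hits its minimum of -1.
```

No proof of a speedup anywhere in sight: just a differentiable cost and an optimizer, which is the whole point being made above.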
And so I fought this fight before. And in a sense, EAs and rationalists, by their very name, remind me of that: they'd rather have these crazy sci-fi-based priors.
And it's like, oh, well, look, these are my priors that are totally fucking inaccurate, but my logic is not flawed: go find a flaw in my logic.
And it's like, your prior is unanchored from reality. And for me to explain that to you, you actually have to learn physics and try to anchor to reality, and maybe unsharpen your prior.
And it's really hard. That's what most arguments break down to with EAs and rationalists.
You know, as a physicist, it's like you could have a nice theory and so on. But if it doesn't agree with reality, it's basically trash, right?
And so you've got to be flexible and work with approximations, and be comfortable with those, because there's a lot about the universe that you can't even characterize or write down succinctly.
And you've got to have some humility: if you're gonna build an analytic model that you understand by hand, with your own human brain, then you're gonna have to make it very simple.
Right, make it an approximation. And if you want to tackle the complexity of the world, actually, the modern way of doing things is using machine learning, tackling complexity with complexity, and bottom-up emergence, as in physics, right?
Like there's there's renormalization emergence of all sorts of properties at different scales.
You can mimic that in deep learning by using machine learning to understand the physics of the world, right? That's physics-based AI.
Those are some of the algorithms I worked on in my career and still work on to some extent.
And that's a different way of thinking, because growing up, I was like, okay, well, I'm gonna be a theoretical physicist, I'm gonna write down a couple of equations,
solve all of quantum gravity, and then we'll figure out interstellar propulsion. Great. But then you figure out that the aperture with which you can look at the world, if you're restricted to analytic physics, is so narrow.
Whereas computational science is much, much broader. I think there's someone.
What are you doing, Adrian?
There's some noise going on.
I was invited. Thank you, guys. This is amazing. By the way, thank you, Beth and everybody. I'm gonna go back on mute.
Oh, no worries, no worries. Yeah, I was saying, there's a point in my career where I had, well, not an existential crisis, but I was like, okay, I don't think humans can understand the universe around us with a couple of equations.
Like, this reductive approach to understanding the world to like a few simple rules and laws is not gonna work, right? And we've had like, I don't know, 40, 50 years to figure out theoretical physics, and we haven't, right?
Like, so I've been through these sort of like, nerd traps of like, if only we think harder and have the right model, we're gonna solve everything with just a couple axioms.
And it's like, it's never worked. We just got stalled. And it attracted all the best minds in the world: string theory and AdS/CFT and all that stuff, the stuff that attracted me. It's a nerd trap, right?
It's like an autist flytrap. It attracts all the best minds in the world, and they waste many years. I mean, they learned some stuff, but they waste many years chasing this dream that will never happen.
Whereas if you're a realist, and you have some humility, and you have looser priors about how the world works, or priors that are more flexible, and you make sure that your only anchors are the laws of physics, then you can make
some interesting inferences about how the world works. And you have a more lucid view, rather than being biased towards what can fit in a perfectly logical inference that you can verbalize efficiently,
right? There are many models that are hard to verbalize but that are accurate, right? More accurate. And so I think we suffer from a bias towards models that can be
easily explained, models of the world that can be easily explained. And the thing with e/acc is that we're pushing models where the relationship is very similar: what differentiable programming is to programming,
experimental or computational physics is to math and theoretical physics, right? It's self-organization instead of top-down prescribed ways of understanding the world, right?
I think, Adrian, you want to say something?
No, no, no, go ahead. I'm not, I'm not, I'm not, this is Mike.
Yeah, yeah, yeah. Gotta manage the hot mics. But yeah, just give me a second here. If anyone wants to jump in, it's a good point for me to hand off, maybe.
I think Scott may have something.
Yeah, sure, sure. Um, hello, Beth. We have not met before; it's nice to meet you. And as you know, one of the rules to be a speaker up here in Adrian's spaces is not only to follow Adrian, but to follow the other speakers.
And so I need to take a dose of my own medicine: embarrassingly, I cannot believe I have not followed you yet. But I would like to take the moment, because you will allow me to actually close a circle.
Uh, or let's say become the other bookend because I don't know why, but about nine months ago, Jeff Bezos actually followed me.
So if you would do me the pleasure and follow me back, I would be able to say I have both of you in my quiver. I don't know why; I am still convinced to this day it was a butt follow, completely accidental.
I think some of the media posted about Blue Origin; otherwise maybe he would not have followed me. But anyways, yes, it's great to be in the space. I don't have any particular questions yet, and I know you've been doing a lot of the talking.
I will try to take some of the heavy lifting off of you and allow you just to, you know, answer some interesting questions here and there. If anyone else wants to jump in again, the rules are, if you want to be in the speaker board, you must be verified.
You must follow the hosts, and Adrian will give you an anal probe just to make sure you really are a human being. And finally, let's see, are there any... oh, you know what? This is from our previous space.
We were talking about whether FSD is going to be transferable on Teslas, and it looks like you might be able to transfer it right now. I'm checking into that; Whole Mars just made it sound like you actually can.
So, uh, not to go too much off topic.
I think Psyche seconds Scott's point. Okay, go ahead. Oh yeah, I just wanted to put a punctuation mark on all of the very interesting things Beth said, which is that I think the duality between mathematician thinking and physicist thinking mirrors, or is an example of, the
duality between foundationalist thinking, where you assume there's some basic set of completely unchallengeable axioms from which all knowledge grows like a crystal, versus a more Popperian model.
A more evolutionary model, right? A more organic model rather than a crystalline model where knowledge is the result of evolutionary processes and you never have a perfectly firm foundation, right?
You will never arrive at principles that you can be literally 100% certain will never be challenged because that's a completely unrealistic goal.
And this embrace of fundamental uncertainty, this embrace of organicism, this embrace of emergent processes, rather than a perfect logical outgrowth of something you erroneously assume to be certain, is something that is borderline spiritual.
And I think really defines this difference in culture between these two communities.
Yes, yes, but you know, emergence is super powerful, right? Self-organization is how all of nature organized itself and created everything we know today, right?
Including the brains we use to generate this speech, right? And it's a fucking amazing process. Like, for example, in physics we have a lot of difficulty understanding phase transitions and explaining them with math or words, right?
But we just know, well, for matter in this region of parameter space, the vibe is different. And people have resorted to using machine learning to identify phases of matter and so on.
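As a toy illustration of that phase-identification idea (my own minimal example, not any specific paper's method): a tiny from-scratch logistic classifier that separates "ordered" from "disordered" spin snapshots using only the magnetization.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_spins(ordered, n_sites=100):
    """Toy 'Ising' snapshot: mostly aligned spins with a few flips, or pure noise."""
    if ordered:
        spins = np.ones(n_sites)
        spins[rng.random(n_sites) < 0.1] = -1        # 10% thermal flips
        return spins * rng.choice([-1, 1])           # random overall orientation
    return rng.choice([-1, 1], size=n_sites)         # disordered phase

# Feature: absolute magnetization |mean spin|; label: 1 = ordered phase.
X = np.array([[abs(sample_spins(o).mean())] for o in [True] * 200 + [False] * 200])
y = np.array([1.0] * 200 + [0.0] * 200)

# Tiny logistic regression trained by plain gradient descent.
w, b = 0.0, 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(X[:, 0] * w + b)))
    w -= 1.0 * ((p - y) * X[:, 0]).mean()
    b -= 1.0 * (p - y).mean()

preds = (1 / (1 + np.exp(-(X[:, 0] * w + b)))) > 0.5
# The classifier separates the two phases almost perfectly from one crude feature,
# with no analytic theory of the transition anywhere in the loop.
```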
So I think, again, once you let go of, let's say, the imperative-programming, software 1.0 model, you get to software 2.0, differentiable programming, and gradient descent is a much better programmer than you, right?
And in a sense, for that same reason, self-organizing systems are better policymakers than you, right? Period.
There's no, nobody's going to have the perception ability to predict and control the system at all scales.
And A/B testing all sorts of hypotheses in a top-down fashion just doesn't make sense.
It's got to be fractal: a hierarchical, self-organizing hierarchy of feedback loops. And we have that within a corporation, right?
We have corporate hierarchies, and then we have like ownership hierarchies, and then you have, you know, states and nations and so on.
Everything needs to be hierarchical. And the level of power and the ability to enact fine-grained control should be proportional to the level of knowledge that that level of the hierarchy has about the future.
And in general, the, the, the higher you go, the broader the system you're trying to predict, the, the fewer statistics you can even predict, right?
You can maybe have a couple metrics for how the economy is going to go and have a rough model that you have some uncertainty about.
You have a few bits of mutual information with the future.
And so you adjust one or, you know, a couple of hyper parameters manually, and that's all you can really hope for in terms of optimal controllability.
But as you go, you know, further and further down the stack, it's like, you can have many bits of optimal control of like deciding what you do.
If you have full visibility, let's say you're an engineer or a coder, you know, individual contributor at the bottom of the stack, then you have a very tight feedback loop where you're iterating on your code, you know, on a few second time scale, right?
And so you have the sort of hierarchy of like tight feedback loops.
There's like a lot of IO, you know, back and forth with reality.
And then as you go higher and higher up the hierarchy, it's kind of longer time scale, fewer and fewer bits that you adjust.
But that's how you keep a system optimally steering itself towards a better future.
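A minimal sketch of that hierarchy of feedback loops, with hypothetical numbers throughout: an inner loop adjusts twenty fine-grained parameters every step with instant feedback, while an outer loop sees only one coarse metric every fifty steps and turns a single knob.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.normal(size=20)      # fine-grained detail only the bottom layer sees
x = np.zeros(20)                  # twenty parameters under the inner loop's control
lr = 1.2                          # the single knob the top layer gets to turn

def loss(x):
    return float(np.sum((x - target) ** 2))

history = []
for step in range(300):
    # Bottom of the stack: tight loop, many bits of control, instant feedback.
    x -= lr * 2 * (x - target)
    history.append(loss(x))

    # Top of the stack: slow loop, one coarse metric, one hyperparameter.
    if step % 50 == 49 and len(history) >= 100:
        recent, earlier = np.mean(history[-50:]), np.mean(history[-100:-50])
        if recent > earlier:      # things got worse: adjust the only knob we have
            lr *= 0.5

# The outer loop notices the divergence once, halves lr, and the inner loop
# then converges on its own fast timescale.
```

The outer layer never touches the twenty parameters directly; it only nudges one hyperparameter on a slow clock, which is the proportionality-of-control point being made above.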
And, you know, it's not by giving more money to philosophy departments or EA organizations, which are going to write big papers that are unintelligible, that we're going to figure out how to run everything in the world.
But they do have a lot of hubris, like thinking that if only they were the ones in charge, like everything would run perfectly, right?
And clearly, they never try to run a business or build anything like because, you know, that will humble you, right?
You can't even if you're a fucking genius, you have to, you know, even if you have very good priors, you're super smart, the market will humble you, right?
Like, ultimately, you got to build stuff that serves the market. And that's why, you know, like, different schools of thought, you know, you have Steve Jobs School of Thought, like, you should have a very sharp prior vision of the future.
And then sort of deform reality to your will, right? Make people want the thing you built. But, you know, YC is more like: make something people want.
You have no prior of what they want, you can, like, just iterate and have a very fast optimization loop and converge on something the market wants.
And then you have a very successful company. To me, it comes down to how much you weigh your prior versus your data, how you update your beliefs in an online fashion.
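That prior-versus-data weighting is exactly what a conjugate Bayesian update makes explicit. A toy Beta-Bernoulli sketch (the "Jobs" and "YC" labels are just my illustrative mapping, not a formal model):

```python
def posterior_mean(alpha, beta, observations):
    """Beta-Bernoulli conjugate update: each 0/1 observation nudges the belief."""
    for obs in observations:
        alpha += obs
        beta += 1 - obs
    return alpha / (alpha + beta)

data = [1] * 80 + [0] * 20                      # the "market" says roughly 0.8

# Sharp prior: strong conviction the rate is 0.5 before seeing anything.
jobs_style = posterior_mean(50.0, 50.0, data)   # 130/200 = 0.65, dragged toward the prior
# Flat prior: almost no conviction; let the data dominate.
yc_style = posterior_mean(1.0, 1.0, data)       # 81/102 ~ 0.794, lands near the data
```

Same data, different prior weights, different posteriors: the sharp prior needs far more evidence to move.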
And in a sense, the doomers want us to have a very clear, exact plan of exactly how we're going to optimally control everything, down to a T.
And it's like, no, you can't run an open-loop plan; you can't plan all of this in advance. The only thing you can hope for is having that control loop be fast enough and smart enough.
And how do you achieve that? Well, the thesis is: by having much more variance in the system, right? In the parameters of everything: culture, technologies, companies. Everything has to have some variance, because everything is sort of an evolutionary
algorithm, and the speed at which the algorithm can traverse the landscape and find the optimum is bounded by the variance of our tests.
And it makes sense, right? For example, that's why we argue for open-source AI. With open-source AI, you have a model, and then people mutate the model and examine all sorts of ideas that are in the local neighborhood.
And then sometimes there's like an evolutionary branch of an old model, right? You can have like a mamba or a different type of model. And then suddenly everyone aggregates around that and there's a bunch of mutations.
And it's a very lifelike, adaptive algorithm. And that allows the community, on a long time scale, I think, to outpace the centralized labs, because really what you're paying for in a corporation, the research and development,
is a long optimization algorithm over the space of technologies. And if that optimization algorithm is more open and collaborative, then it can converge much faster. Or rather, actually, maybe not converge exactly.
Well, okay, so centralized orgs go much faster for things that are evident, right? For things like, okay, we just got to go in this direction faster.
But what sort of decentralized approaches do is find new modes of thinking, because they'll explore many new modes, it's like a much higher temperature optimizer.
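The high-temperature-optimizer point can be made concrete on a toy landscape (the fitness function here is hypothetical): a greedy "centralized" hill-climber locks onto the nearest broad peak, while a high-variance "decentralized" population also samples the narrow global peak.

```python
import numpy as np

def fitness(x):
    # Hypothetical landscape: a broad local peak near x = -1 (height ~1)
    # and a narrow global peak near x = 2 (height ~2).
    return np.exp(-(x + 1) ** 2) + 2.0 * np.exp(-((x - 2) ** 2) / 0.05)

# "Centralized" strategy: low-temperature greedy hill-climbing from x = 0.
x = 0.0
for _ in range(1000):
    if fitness(x + 0.01) > fitness(x):
        x += 0.01
    elif fitness(x - 0.01) > fitness(x):
        x -= 0.01
greedy_best = float(fitness(x))        # climbs the broad peak and gets stuck, ~1.0

# "Decentralized" strategy: a high-variance population scattered over the space.
rng = np.random.default_rng(0)
population = rng.uniform(-4.0, 4.0, size=5000)
explorer_best = float(fitness(population).max())   # samples the narrow peak, ~2.0
```

The greedy climber is faster per step, but only the high-variance search ever sees the new mode, which is the exploration argument above in miniature.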
And so both are important to some extent, and in a sense we can have both. But we shouldn't outlaw open source, for example, in order to save the big centralized orgs' margins, right?
And so that's something we argue for. Okay, I've been speaking for a while.
That makes perfect sense to me. And you know, if I could have doomers read any author, it would be Hayek, right? Because I think there's there's an arrogance here, or maybe just a lack of systems thinking or a lack of abstraction, where they assume that if we are not following their grand plan, we're following someone else's grand plan, and they know they're smarter than most people, at least.
So they're like, Oh, you should follow my grand plan instead. But they don't, they fundamentally don't understand that what we are saying is that we do not have the ability to plan these things.
And in fact, are merely neurons in a larger entity, right? The entity of techno capital that is planning these things. And that entity is smarter than us, right? Imagine if one of your neurons got anxious and decided that it had to capture your entire brain, lest your brain one day,
decide that neuron was no longer beneficial, right? Like, there's something fundamentally broken here, where there's a refusal to accept higher level organization than the individual who is making this judgment.
Well said. Very well said.
I'd like to kind of point back to the argumentation about algorithms.
Largely, I think you shouldn't blame the distribution on the algorithm itself, because the algorithm is just a tool. Through it, you can push practically anything: anything can work, and anything can also not work.
It just depends on how distribution exactly works. I think x is really good for exactly this. You can achieve a lot of things on x. And I think you could definitely find the people of value to you.
A lot of times we seem to conflate numbers with actual weight. I think personally, if you attract people with very heavy weight in the market, it is a lot more valuable than just numbers on their own.
I think numbers themselves, in terms of volume, are inherently meaningless unless you consider the actual weight of the individual numbers, right? Because otherwise, if it were just the volume of numbers that counted, then bots would actually be beneficial to your reach.
When in reality they are not, because they distribute you into a part of the network, of the algorithm, that is essentially a dead zone, because the bots aren't real and they don't see anything. But yeah.
Hi, Beth. I had a quick question. So I was watching the debate on YouTube today, and firstly, I wanted to say you were so incredibly patient. So, one second.
Sorry, yeah, you were so incredibly patient with that debate. I have no idea how you did it. But also, there was a point where you were talking about how capitalism will drive accelerationism, which is obviously a principle, a tenet, of e/acc.
And something that was kind of raised by Connor was, basically, he was hinting at the idea of how it affects other countries. And I started thinking more about what the exact steps would be for a philosophy
that, let's say, proliferates in a capitalistic economy. What would be the exact steps for it to proliferate into economies that are not capitalistic right now? And do you think there are ways in which e/acc within America could, by nature, help the rest of the world?
Yeah, can you give examples of non-capitalistic economies out there? I'm just trying to understand. I mean, most countries leverage capitalism at this point, to some extent, right? Even China.
Right. I guess there are countries like Venezuela. The point is that when you say capitalism in e/acc, I think of something more perfectly capitalistic, like America.
And so, yeah, I was wondering, firstly, if you could explain what you mean by capitalist.
I mean, capitalism is a fucking AI algorithm, right, for us to organize ourselves. It's kind of like a Hebbian learning system, similar to neurons, right? It reinforces links: you could think of companies as neurons
and cash as the firing patterns between them. Companies that do business together tend to increase how much they do business together, what fires together wires together, and neurons that get very high utility and have a lot of connections
tend to grow or fork off, and there need to be redundant pathways, or competitors emerge.
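That Hebbian picture can be sketched as a toy simulation (all numbers hypothetical): links that carry more trade get reinforced, while every link slowly decays.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                                  # five "companies" acting as neurons
weights = np.full((n, n), 0.1)         # propensity of i to do business with j
np.fill_diagonal(weights, 0.0)

for _ in range(2000):
    # Companies 0 and 1 trade often; everyone else trades at random.
    if rng.random() < 0.8:
        i, j = 0, 1
    else:
        i, j = rng.integers(0, n, size=2)
        if i == j:
            continue
    weights[i, j] += 0.01              # "what fires together wires together"
    weights *= 0.999                   # slow decay: links must keep earning their strength
    np.fill_diagonal(weights, 0.0)

# The 0 -> 1 link ends up an order of magnitude stronger than a random link.
```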
And so basically, it's a, it's a technology, capitalism itself is a technology, right, like the idea of capitalism and the legal system surrounds it.
It's a technology for us to amortize the costs of producing things, to have fractional ownership in the future successes of technological endeavors. And, you know, it's similar to how eukaryotic cells,
in a sense, have fractional ownership over the success of the organism they're part of, right? Genetics are like a cap table, in a sense, right?
And that gives a huge advantage over, say, bacteria, which are prokaryotic, I think; you're the biologist. But yeah, to me it's as fundamental as going from prokaryotic to eukaryotic cells. And I think we just need to increase people's literacy on how capitalism works
and why it's a force for good, so that politicians can stop blaming capitalism for their shitty management, or their corruption, right? They'll decel the shit out of several industries, over-regulate the shit out of them, grease
their palms, and then blame monopolies or capitalism. And it's like, well, if you hadn't let them achieve regulatory capture, there wouldn't be a problem; they could be disrupted. But you did that; it wasn't capitalism. And it's funny how, growing up in the West,
we're kind of psy-opped to hate capitalism, or to think it's a big bad force, and we're kind of fighting against that psy-op: it's a very powerful creative force, right, and a force for immense good.
And, you know, I think it's very fundamental to humanity. It's one of the most potent things we've invented. And again, it's totally virtual. It's like a fucking software technology that runs on our brains, right? I mean, there's the legal system, and then capitalism on top of it is just data, right? It's all software.
It's like a capital allocation algorithm, and we all respect it and try to uphold it. And the point is, you try not to break it, right? If you start fucking with regulations, you're fucking with how this optimization algorithm compiles, so there's gonna be bugs, and it could crash, right?
And that's why it's like, don't fuck with it, right? Stop messing with it. Stop adding tons of exceptions; you're slowing down the compile time, slowing down the clock speed at which you can operate. Just get out of its way, right?
So, yeah, anyways, that's how I'd define capitalism in itself. That's the whole thing.
No, but that kind of answered a lot of the question; that was amazing. But the second part of my question was: we talk about it from a more theoretical and futuristic perspective, but let's say, from now onwards, for the next 10 years, we had the best-case scenario with regulations, in terms of there not being many additional ones.
Then, in that case, do you expect to see tendencies of this movement going to countries where AI might not, you know, be very present at the moment?
Look, I think countries that start embracing the techno-capital machine... okay, first of all: at the end of the day, if you don't respect corporate law, you get sued, and if you don't pay your lawsuits or whatever, then eventually
people resort to violence, right? So this software legal stack is the software version of power, and violence is the hard version of power. It's the hardware on which it runs, right? And you need an orderly society, to some extent, to run a proper legal system. But obviously, if centralized entities have
the monopoly on violence, completely exclusively, that opens up the door for oppression, right? And that's why we have the Second Amendment. Because if something were to happen, the government became corrupt or something, and people rose up, at least they'd have some weapons, right? Not the most advanced weapons, but it could be enough of a problem just out of sheer numbers. So that keeps the government in check. But there are some countries that can't have functioning capitalism, because they can't
compile, they can't run, the software of a functioning legal system that isn't fucking corrupt. Because their hardware stack, their violence stack, is fucked up: they're always at war, having civil wars, or pillaging each other's villages. So you've got to move past that first, and then you can have this sort of tech, right? Like, AI is running on a crazy amount of hardware development, right? It's running on freaking GPUs;
it took, you know, 50, 60 years to develop the stack, right? And it's like, well, how can I run AI here in this river, using basic rocks? It's like, no, you've got a lot to build to get there, to be able to run that technology. So capitalism has a certain depth in the tech tree of civilizational infrastructure tech, if you view all of knowledge as a tech tree, quote unquote,
right? And, yeah, I don't know, does that make sense? Yeah. Yeah. Yeah, Beth, I think I can iterate on that for you a little bit: free markets are probably one of the most powerful forces in nature. You know, just as you can try to change the course of a river, nature will decide what the course of the river is going to be in the end. So you can fight it, but it's a losing battle. And I experienced it firsthand myself back in the 80s, when I visited the Soviet Union,
supposedly a country that had outlawed free markets and capitalism. And one of the first things I came in contact with there was a free market. It was called the black market, but it was a market. It just kind of shows it's there. You can do whatever you want to try to stamp it out, but human nature is that we are traders. We like to trade. There were markets even in China, way before Xi Jinping and the others came along. There were still markets going on, even though the country was
officially communist. And eventually they accepted the fact that, hey, there's a market going on here; we need to exploit that, or at least take advantage of it and use it. So you're right: governments can do whatever they want to try to tamp it down, but it's a lost cause. In the end, it's what people do. It's what people want. It's a fundamental right to be able to trade with your neighbors, to set prices and figure out what you want to barter. And it's
better to go with the flow and use the markets to your advantage. And as we're seeing, one of the best ways to fight climate change is to convince the markets to do it, not to have some sort of big government programs, which are very oppressive and very austere and everything else. The reality is you want to have change. You want things to be different. You want to improve things, then more or less encourage the markets to move in a particular direction, but don't force them.
Yeah, I think forcing the markets, there's a back-reaction, right? It's a dynamical system. And if you try to deform the markets away from their natural tendencies, there tends to be, again, a back-reaction and a sort of quench dynamics, right? If you have a dynamical system that's self-adaptive and you don't adiabatically change the
effective dynamics, the effective boundary conditions or whatnot, if you change them suddenly, there's a response, and it can be very chaotic. This is similar to, let's say, if Jay Powell were to fucking nuke the interest rates overnight: markets would react in a very messed-up fashion, right? One way or another. Anything you change in a very
discrete, top-down way really affects the markets. And again, it's literally changing one hyperparameter; they have one knob, one hyperparameter, and they're looking at a couple of data points, and it has all this impact. It's kind of crazy. But, you know, some people think there should be no Fed at all, no top-down hyperparameter at all.
Which might be fair, but I do think like, I'm not for total zero top down control. I think like some top down controls sometimes can be used. But again, it's got to be super light touch, super light touch. You can't have huge controls. And that's not what the AI safetyists are currently proposing. They want absolute control over AI and be able to prescribe who can build and who cannot, right?
And this new stack, this... yeah. Adrian?
Yeah, we got Psyche. I just wanted to say that I think there is a compelling, relatively free-market case, or at least a libertarian or propertarian case, for certain environmental regulations, right? Which is that emitting carbon affects everyone, right? Like, I worked out the atomic resonances in my physics classes; this is a real phenomenon. The degree of its effect is something that can be debated, but it exists. That is imposing a cost on everyone. And so distributing that cost over everyone who bears it is arguably a libertarian project, right?
Like, you're preventing someone from outsourcing their consumption of the commons, right? You're, you're actually correctly attributing costs. And I think that we can understand that without creating a centralized authority, authoritarian state that has the ability to regulate every aspect of our behavior, right?
I think we can be rather precise about these things. And when precision is costly, we should err on the side of distributed systems rather than centralized systems, for rather obvious reasons that have been spelled out here.
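That cost-attribution idea can be sketched in a few lines; the numbers and function names below are invented, not anything the speakers proposed. Price the shared harm in total, then bill each actor pro-rata for their share of the commons consumed, with no central planner prescribing behavior.

```python
# Hypothetical pro-rata attribution of a shared external cost.
# The emitters and the total social cost are invented example figures.

def attribute_costs(emissions, total_social_cost):
    """Split one total external cost over emitters, pro-rata by emissions."""
    total = sum(emissions.values())
    return {who: total_social_cost * amount / total
            for who, amount in emissions.items()}

emissions = {"factory": 70.0, "fleet": 25.0, "household": 5.0}  # tons, made up
bills = attribute_costs(emissions, total_social_cost=1000.0)

assert bills["factory"] == 700.0                 # 70% of emissions, 70% of cost
assert abs(sum(bills.values()) - 1000.0) < 1e-9  # the full cost is recovered
```

The open question the speakers actually debate, how big `total_social_cost` is, sits outside the mechanism; the attribution itself stays decentralized.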
Great. About the philosophical elements, I think it was really interesting for me to try to address... I think this is exactly what is being done right now, where certain philosophies are being brought across, one of which I tend to be in disagreement with, which is UBI. Which is, in execution, to an extent just socialism, practically, just recycled socialism refit to be integrated into the modern world. What do you think about UBI?
Yeah, well, I'm pretty much against UBI. Again, it's like the way I described capitalism, right? You have a sort of hierarchical organization of different entities: individual corporations, subsidiaries, sub-teams, and so on. Capital is kind of like neural flux, right? It's a way to relay messages, a way to relay forces between the entities. And so it's actually about gradients: gradients of incentives drive the system, right? If you program the incentives such that you make more money if you do X, Y, Z, then people will tend to do X, Y, Z, right? And that's how you make the system have certain dynamics. But UBI is giving everyone constant money; it has no gradient, by definition, it's a constant function. And so it's not changing behavior. It's not driving any sort of behavior towards something we want, right? I think the point is, it's a fucking waste of money. And I think
the capitalist machine itself is an optimization algorithm that reallocates capital all the fucking time, right? Towards things that have positive utility towards growth, right? Like a company is going to manage its capital and put it towards R&D efforts that have a high likelihood of increasing the value of the company and making it more profits and growing. And it only achieves that if it provides utility towards growth of the broader organism that is civilization. And so, like,
Why would you take away money from that optimization algorithm that has proven to be scalable? And first of all, where does that capital come from? Right? Is it taxes? That's not great; depends on your taxation scheme. Are you going to tax the most productive people and take away their ability to allocate capital? And instead you're just going to give it
to everyone. But the thing is, it's gonna cause mass inflation. We tried it: we had a mini-UBI during the pandemic, right, more or less, with the fucking COVID checks, the stimulus checks. Stimmy checks, man. And everybody just YOLO'd it into stocks, and then we had a bubble, right? You saw it. People didn't know what to do with it, so they gave it to companies, so the companies could allocate the capital better, and then they own a piece of them. But, you know, if people were taxed less and just set money aside and invested it in the stock market, we'd save many steps here. So yeah, I'm not pro-UBI. I mean, there's currently a plan, I don't know where he is now with this, but Sam originally was worried OpenAI, with its anthropomorphic AI, would break capitalism and the job market for humans. And then his solution was to create a UBI org. You know, he's not necessarily trying to control everything, I don't know, but that is his solution, and that's the empathetic explanation of what he's doing. Another one is that he's creating one of the most powerful companies, and then everyone's gonna have to be registered in a certain database, and if you're not in
this database, you're not going to have any cash to live, right? And so you're gonna be served slop, and you just have to fight for rations, right? It's very communistic, and it's really fucked up. I don't think there's any world where communism, giving everyone free shit, is ever gonna work at scale or produce good things.
Right. Again, I think the purpose of capital is to program gradients of incentives to get certain behaviors, right? And gradients of incentives means there's a delta of capital: if you do this, you get more capital; if you do that, you get less capital, right? And if everybody has flat capital, there's no gradient. Basic calculus. So I'm against it. Anyways, that's my take on UBI.
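The "no gradient" argument can be written down as a toy model; this is mine, not the speaker's, and all the payoffs are invented. A targeted incentive changes the payoff difference between actions, while a flat universal transfer shifts every payoff by the same constant, leaving the difference, and hence the choice, unchanged.

```python
# Toy decision model: an agent simply picks whichever action pays more.

def best_action(payoffs):
    return max(payoffs, key=payoffs.get)

base = {"ship_product": 100.0, "do_nothing": 60.0}  # invented payoffs

# Targeted incentive: raises one payoff, steepening the gradient.
with_bonus = {k: v + (50.0 if k == "ship_product" else 0.0)
              for k, v in base.items()}
# UBI-style transfer: the same constant added to every payoff.
with_ubi = {k: v + 50.0 for k, v in base.items()}

def gradient(p):
    """Payoff difference between the two actions: the 'incentive gradient'."""
    return p["ship_product"] - p["do_nothing"]

assert gradient(with_ubi) == gradient(base)        # constant shift: unchanged
assert gradient(with_bonus) > gradient(base)       # bonus steepens the gradient
assert best_action(with_ubi) == best_action(base)  # so the choice is unchanged
```

The constant function drops out of the difference entirely, which is the "basic calculus" point: adding the same amount everywhere moves no decision margin.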
Yeah, I still remember, I made a take on it as well. I just said, hey, you know, we should really have this just UBO, which is universal basic opportunity, which is the one thing that you actually need in this world. And then somebody tagged Scott Santens in that who wrote a book about money, apparently, and he said, that's a really stupid take.
I'm like, you literally advocate for UBI, and you wrote a book about money. I hope you understand that inflation literally defeats the purpose of money itself, especially when the promise of it as a medium of exchange for utility is not upheld at all, because it's broken, because it's free, right?
What are we going to do? I think UBI, in general... it seems like a mental illness to think that everything is going to be free in the future, because the universe itself is finite; the energy in the universe is finite. How do you think things are going to become free if you live in a world that is finite? I think we're going to look at the world perhaps a little differently; maybe currency itself, and the very concept of it, will be eliminated over time. And maybe the idea of working towards a specific goal gets incentivized, rather, by an outcome for the general good of everyone. But I think it will just evolve; I don't think that UBI or anything like it will ever help along the way. We just have to realize there's a certain amount of stuff that we can do and some stuff that we can't do. And I think that over time we should potentially abandon money as a motivator for trying to improve civilization, or rather, I think recognition should be the ultimate form of payment. How much of a mark can you make on time? That's how I'm motivated. I like to make a mark on time.
Yeah, I mean, in open source you get paid in social capital, right? There's a sort of status points, to some extent, but the point is that that opens up later opportunities to have paying jobs, right? If you've had impact, later on that gets you actual capital that puts food on the table. And past a certain point, right, if you have more capital, it's not necessarily to live in a fancier place. It's just: okay, now you have more capital than you can spend personally. Now you have votes in the system, and you're part of the capital-allocation intelligence, right? You've got to figure out how to allocate your capital in the machine to make it grow. And I think something similar happens with social capital, right? Or technical pedigree. If you have a track record of getting shit done that's very hard, and you've shipped it, people will trust you with more and more complex projects and more resources to handle. And that's effectively similar to having more capital to allocate and putting it towards R&D. And so both social capital and real capital are exchangeable to some extent.
Just like, you know, if you have a large following, you can launch a hoodie store and make actual money, right? And it helps with rent. And, you know, check out my Linktree. No, I'm just kidding. It does help. I am a highly illiquid founder. Okay, yeah, and so,
yeah, we need more EAC merch. I need to work on it; I just literally don't have enough hours in the day. I'm either doing an interview or an important call or hardcore deep work. But, you know, I really am loving how things are evolving right now. Obviously, I get to talk to very cool people and have a platform to share ideas. And, you know, it's a lot of energy, but I'm really building the dream technologies that I've wanted to build for a long time, and it took me a very long time to discover these technologies. And the community-building I'm really happy with, too. Hopefully we can keep it growing, and it can evolve, right? It might not be called EAC in a couple years; it might be called something else, something-something-optimist, or whatever it evolves into. I'm happy as long as it has had the impact, right? And we'll see it realized. So, yeah, total tangent. But yeah, definitely, I think social capital is actually something that is exceedingly underrated. Again, this is where I mentioned the aspect of looking for weight as opposed to volume. Say, for instance, if you are like me: I don't do any of this for money. I'd work in real life if I wanted money, right?
A lot of other people do spaces so they can get monetized, and then they extract value out of the platform. I like to add value; I like to be useful. And for me personally, talking to some of the most interesting people in the world is more of a reward than money ever could be. Because what am I going to do with money, right? You could spend it on some stuff, and then you'll reach that same goal. Well, what if you can reach that same goal beforehand, and just achieve much greater things, and, I'd say, even perhaps achieve a higher state as a result, because you're simply learning from the very best in the world. So social capital over just normal capital, right? The promise is higher.
Yeah, I mean, you maintain optionality, right? You don't know at which point you're going to convert your social capital; you keep it compounding, and eventually it can turn into real capital. But I would advise eventually turning it into real capital, because then you can take action, right? It's not just building clout: you can fund someone, build something, buy something that helps you have the impact that you want. Like, for example, if we were to host the EAC rave or something, right? Which, by the way, may or may not be an event in the next couple of weeks, wink wink. But, you know, I partially funded that with some of the merch sales, right? And that's an example of ways I could put capital towards that, or, like, getting a designer for better merch. You know, that's techno-capital acceleration, right? Like, if a product gets used, it gets better.
And yeah, it's kind of a beautiful thing. So don't be afraid to capitalize. I think it should be less stigmatized; I think we should just be more upfront about people monetizing. And I think Elon's change with the X platform has been net beneficial: people having fractional ownership over the value of the network, right? If they build a big following, they get some monthly revenue.
You know, I still think it's pretty tiny compared to the actual value you can get out of the network if you find other ways to leverage it. But yeah, for me, all this liquid capital from the Twitter-following stuff is very minimal compared to the wealth I'm creating through creating technologies, monetizing the intellect I got through literally just being monk-mode isolated for like ten years and going ham learning as much as I could. And now creating really potent technologies: that's just many orders of magnitude more capital than what I'm getting through any sort of monetizing of this network. But at this point, for me, it's just the smart thing, where I don't want to take capital away from my company for related things. I want to have, yeah, based revenue that I can put back towards causes, right? Either sponsoring a hacker, getting them a flight ticket, getting them a GPU or something, just helping behind the scenes and doing things like that. Obviously, eventually it would be great to start a fund or something, fund accelerators and so on. For now, only so many hours in the day. But yeah, social capital is like a pre-image of actual capital.
But capital gives you optionality for action in the world, right, in the techno-capital machine. So it's good, and at some point you should transduce it. That said, at some point having all the capital in the world gives you too much optionality, and you don't know what to do with it, right? And, you know, at that point human relationships help your neurochemistry, and information becomes more valuable than money, and knowledge is more valuable than money.
And, you know, I've had the opportunity to work for some of the richest people on Earth, and seeing what the end of the video game looks like, in terms of what gets them fired up, is very interesting, and very different. It becomes, yeah, I can't talk too much about it. But they're searching for meaning and impact in a different scope and scale, right? And that's sort of more powerful than just more capital at that point. Yeah.
Yeah, I mean, look at Elon, for instance. He is, on paper at least, as far as we are aware, the richest man in the world, right? But what does he do with his money? He simply achieves higher states, a higher goal, right? You have enough money? Cool, now go to Mars. I see a lot of billionaires just using their money to do relatively counterintuitive things. And I'm not criticizing; it's their money, and they have every right to use it as they please. They can buy themselves a big-ass yacht that does nothing but break down and continually consume money; they make more money than it could ever consume. Or you could do what Elon is doing: forfeit all this and basically try to drive civilization forward, into the stars. That's something different, something even bigger. I think eventually, if you reach a certain state of capital, you could do much more. Even with the social capital thing, right? Eventually you make business deals, and the business deals then turn into a scout for intelligence, say, or utility on the platform. You can find some really smart people here. They're just waiting for an opportunity, if you present it to them; you give them a challenge, and if they complete the challenge, then you can use them going forward for a lot of interesting things, because they will have shown themselves to be exceedingly useful. Right? And I think that's how social capital turns into actual capital. Say you find someone who's a computer wizard, like an intern, and that intern can fix your systems whenever they break down, or even create new, more efficient systems based on whatever instructions you have. Because your brain may be really good at certain things, but another person's perspective may be much, much better as well. You know what I mean? That's how I see social capital turning into actual capital on this platform. Whereas I don't believe you should mine money off the platform in the traditional sense, for ad revenue; that's a nice little hobbyist thing you could do. But I think
it lies in the business deals that you can make, based off the intelligence that you find and people with utility. 100%. Got a bunch of hands. Yeah, I think we'll get Kirk out. He's been waiting quite a while in the speaker request section. So I'll go with Kirk out, then wiggle in Sierra. Thanks, Adrian, man. I appreciate that. This conversation is really quite awesome. As you know, Adrian, I come from like a political organizing background. And
even though I probably hang out with more people that lean left, I am just a rabid supporter of economic agency: markets, free enterprise, voluntary association. I think more and more and more of that is actually the solution. And to that end, I actually think one of the biggest bottlenecks we have in capitalism, which does, in my opinion, generate some real grievances and some real meaningful misalignments in stakeholder interests, is that there's actually a lot of friction around, for instance, P2P money transfer and gifting, as well as community capital formation. So you have the accredited investor thing; the only way to start a business is if you have enough money and are allowed to. And the banking system: the money comes from the center, it goes to the banks, the banks get to choose who gets to try, etc., etc. That's the top-down. And then there's no countervailing bottom-up infrastructure.
Well, I don't want to say none; not enough of it, rather. You have Kickstarter and you have Indiegogo and all this stuff, but it's so one-way. There's no reciprocity; there's no way to build networks, like social media, of reciprocal economic relationships that are non-transactional. And I'm actually really anxious to see what X chooses to do with the money-transfer license they're going for, because to me, the greatest thing we could possibly do,
not just for capitalism, but maybe for all of civilization, is to make money as frictionless to move, to share, and to propagate as ideas currently are, especially on platforms like X, and especially in line with the propagation of those ideas.
So just imagine right next to your like, your share and your comment button is like a boost button. And if you threw a penny, a nickel, a dollar, whatever, all or almost all that's going to go to the person and maybe just a tiny bit of it will go to actually like, boosting the post itself, rippling it out a little further on the platform.
And then just imagine that everyone is able to express at any given time, their need, their interest, their idea.
And then, as quickly as it can go around the world, it could also be funded, micro-funded at that, spreading the risk that most capitalist endeavors use as an excuse, I would say, or a rationalization, for value extraction: we deserve profit, we took the risk.
But if you spread the risk out, it kind of blunts the argument that we should get profit, but we can still be collectively funding through market forces and through the pooling of capital, we can still be funding all of our needs, all of our desires.
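The boost-button mechanics described above can be sketched as follows; the 95/5 split and every number here are invented, not real platform parameters.

```python
# Hypothetical micro-funding split: most of each payment goes to the poster,
# and a sliver funds extra distribution of the post. Integer cents, floor split.

CREATOR_SHARE_PCT = 95  # assumed split, not a real platform figure

def boost(amount_cents):
    """Split one micro-payment into (creator payout, amplification budget)."""
    to_creator = amount_cents * CREATOR_SHARE_PCT // 100
    return to_creator, amount_cents - to_creator

def pool(boosts):
    """Aggregate many small boosts: risk and funding spread over many givers."""
    creator = sum(boost(b)[0] for b in boosts)
    amplify = sum(boost(b)[1] for b in boosts)
    return creator, amplify

creator_cents, amplify_cents = pool([100] * 10_000)  # 10,000 people give $1 each
assert boost(100) == (95, 5)
assert creator_cents + amplify_cents == 1_000_000    # every cent accounted for
```

Note that at very small amounts the integer floor shifts the effective split (a 5-cent boost pays out 4 and 1), so a real design would need a rounding policy for tiny gifts.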
Thanks for letting me say that. Curious if that bounces off anybody in any particular ways.
Yeah, I just wanted to say: you said you wanted a system that is reciprocal but not transactional. What is the difference between those two concepts? Because to me, they seem identical.
Well, I'm not saying one instead of the other. The idea that someone has a gift that they want to offer the world some skill, you know, whether it's software engineering or, or dry cleaning, you know, it doesn't matter.
It's totally fine to have a transactional relationship somebody needs your service they come and get your service they pay you for your service.
But what I mean is that so much of our economy is transactional. It's actually taken a lot of the humanity out of the world. And I mean, even charity: in the United States, we are so charitable, but how much of that money actually reaches its desired destination? How much of that money is actually being utilized properly?
Very little. And I think part of that is because we don't really have a durable, digital social context for giving, such that if somebody expressed a need, rather than having to find a nonprofit to hopefully help them, they can just express the need.
And people can just, like, micro-gift just a nickel, you know. We have hundreds of millions of people on this platform; if it resonates, the money's going to get to you.
And that's kind of what I mean by non-transactional: if someone expresses a need, and I have any sort of warm sentiment for them at all, for one thing, giving them a penny is nothing, but it could add up; or I could give them a dollar, I could give them five, I could give them a hundred.
But it's not transactional, insofar as they expressed a need and I felt good about helping them meet that need, not necessarily with any idea of getting anything in return. However, with the dynamics of social media, since I gave them some amount of money,
maybe the way the algorithm works is, if I ever express a need that has a dollar amount to it, they'll see it in their feed. I've given; now they have an opportunity to give back. But it wasn't transactional. Does that make sense?
I mean, it seems like you should pick one way or the other, right? Like either... they don't understand.
No. Well, so either you are being purely charitable, right, like you are giving to somebody who you don't feel has the capability to ever pay you back; or it's a loan, essentially, right; or you are giving with an expectation that you will get a net return on average.
So I think you need to be very clear about what your goals are here. Are you trying to be charitable? Do you have an internal value system that says, I'm okay with taking the losses because this achieves some moral or aesthetic end for me? Or am I making a business decision and being coy about that?
I feel it's very cowardly, right? Like, you're trying to have it both ways. You're trying to say this is somehow simultaneously charitable and moral and a sound business decision.
But those are in conflict, and you should own that conflict. I think the brave thing to do is to own it and to say: these are the circumstances in which I am making a business decision, a positive feedback loop; and these are the circumstances in which I am feeding resources into a negative feedback loop. And you should be straightforward about that rather than disingenuous.
I'm not, I mean, maybe I just didn't explain it very well. I'm not, I'm not sure you actually grasp what I'm suggesting and that might be on me, and that's okay.
But for me, what I'm talking about specifically is agency, maximizing individual economic agency. So it's not about being coy or not being coy. It's about individuals and communities being able to express their needs.
And as that expression of need reaches other people, other people having the agency to choose whether to facilitate that need or not.
Now, at the expression of that need, it's on that person expressing or on the person considering giving to get clear on whether that's a gift, or whether there's some expectation of return.
Which is exactly what I said. Is it a gift or is there an expectation of return? You're just rephrasing what I said.
Well, I mean, I'm rephrasing because it sounded like it kind of came at me as a little bit of an attack as like a suggestion that I didn't know what I was talking about.
But I'm just expressing mechanistic opportunity to increase economic agency. So individuals choosing whether to give or whether to loan or whether to invest, that's totally up to them.
But we do need to increase that agency, because right now, people have an experience that most of their life is determined top down. And that's not exactly incorrect.
And part of that is because when they look around and they say, hey, this kind of sucks, and they could look around with 100,000 other people and say, this kind of sucks, there aren't really readily available, let alone effectively modeled, tools to organize around a perceived issue.
There's a lot of expectation that our government is going to do it for us or some other white knight corporation is going to come and do it better or an NGO or something. And it's all top down.
I feel like we should have the right to fix the potholes ourselves if we want to, and then hold the government accountable for the fact that they weren't able to do exactly that. That's my personal opinion.
I think if you want accountability, you'd have to have more of a direct accessibility for people to government and their ability to say kind of control what they're doing to more of an extent than we do now.
I think by and large, when we look at charity and say crowdfunding in this sense, I think we need to put less control on that. I think we need to have more transparency.
Like, here's the thing. Let's imagine for a second there's a big-ass hole in the ground in the street, and the government's not doing anything about it. Well, here's what we're going to do.
We're going to make a big amount of noise about it. We're going to see if the government is going to fix it for us. And if not, then we'll fix it ourselves and then we'll hold them accountable and make sure that somehow the money that we spend on fixing the problem that they are supposed to fix is then deducted from all of our taxes collectively, however much was spent by the individuals who contributed to the charity.
This, I believe, would definitely start solving problems for the government. It also ups the standard because we did it with our own money.
Actually, Adrian, they already have a solution for that in England that is working quite well. And that is, whenever there's potholes, people go out and they draw an enormous penis around the pothole and it gets the attention of the government and they come and they fix it.
Seriously, that's what someone figured out about 10 years ago. It got their attention and it doesn't cost a whole lot. But you're right, you've got to get their attention to what the problem is and eventually they will fix it if you make enough noise.
So yeah, I want to say that I simultaneously agree that decentralizing charity is a very good idea and especially rendering it non-compulsory. So as it currently exists, foreign aid is mostly done through the taxes of various governments, which is a non-voluntary system.
People are people, you have to pay your taxes. And so whether or not you think these causes are efficient, whether or not you think they're good, whether or not you think they align with your moral values, you're compelled to give to them.
So any system that decentralizes that, that removes control from unelected, and
I did. You're rugging a little bit, Psyche. I'm sorry, you were rugging just a little bit. I'm what? Sorry.
Rugging? You were rugging. We couldn't hear you for a second. Oh, sorry. So yes, I both support the decentralization of this project and question the fundamental project of endlessly contributing to something that doesn't generate.
Right. I think that any. You're rugging again. Okay, I'll just stop, I guess.
Yeah, I think there's no, it seems to be better now, but sometimes if you drop down and come back up by speaker, it fixes the problem.
So that's just the internet. Are you on mobile? Like, what are you using right now? Are you using, say, a Wi-Fi network, or...?
Yeah, I'm on Wi-Fi. Can you, can you hear me? Yeah, we can hear you. Yeah, much better now. Yeah, I was just cutting out.
Do you know what they call Wi-Fi in France? No, what do they call it? Let's see. It's "wee-fee."
Yeah, bullshit. They use Wi-Fi six Wi-Fi material. It really speaks to, you know, American control power. Yeah, I would say that like,
[inaudible]
Okay, I think your mic died. Yeah, now you're rubber-banding.
That's not a rubber band; it's just that your mic is completely non-functional.
Uh, he just has a lot to do.
Yeah, he said, I have to go at 50 minutes.
And now the 50 minutes are up.
Yeah, that was good, though.
I didn't get to make my comment.
Um, I know you didn't get to talk to the real Beff, but I can be, like, the knockoff version.
I can be, uh, what's that, wish.com Beff? So here, yeah, the wish.com discount version.
Wish.com is even worse, by the way.
Okay, then do it. Like, why did you mute yourself after saying that? Oh, I'm waiting to hear her comment.
Well, he can go ahead; his hand was up before me.
So if you want to go first.
No, go ahead. It's all right.
Uh, well, the conversation has drifted quite a bit. I just wanted to comment a little, or perhaps offer a different perspective.
Very much of this is going to depend heavily on whether or not you're a believer that AGI will bring upon an age of abundance.
Um, I don't like the term UBI; first of all, let me start there. UBI as universal basic income isn't how I see the future, but I do believe, um, in a post-AGI world where necessities are met absent of human labor: access to food, water, shelter. Where people really won't necessarily need to work to live.
That does kind of beg the question of what life in such a society will look like. Um, obviously, I believe there will still be people who want to work, but I think we might shift more towards a participation-based economy in that respect, where universal housing is available to those who are not able to afford it, because it's able to be achieved fully autonomously.
Like, we have HD Hyundai XiteSolution, who has just committed at CES to fully autonomous, end-to-end construction sites spanning all sectors, from commercial to residential to infrastructure, by 2030. So that's six years away. And they've committed to it, and they're responsible for a lot of large-scale construction projects across the globe. Um, and there they will have removed all human labor.
Um, so, and that's pre-AGI; like, we have not even achieved AGI. Uh, I think we all know a lot of jobs are going to go away. Um, many won't have the means to live. But if needs are met absent of human labor, uh, what is money after that?
And so to me, UBI, or what we currently refer to as the idea of UBI, is not going to be everybody getting checks in the mail monthly, but rather in the form of those basic necessities being met. And I think that turns our economy into something entirely different.
Not all, but most of us strive for more than just our basic needs being met. Uh, but I think the dynamics of the economy are going to change in ways we can't quite imagine, and some sort of distribution, redistribution, is going to come into play just due to the sheer abundance and the lack of scarcity.
Everything that our economy functions on right now circles around the idea of scarcity. What is it going to look like when scarcity is not a factor?
So I do believe that there is some form of UBI coming; I gotta come up with my own name for it.
Psyche, do you want to talk? Because, uh, I disagree with you, Sierra, but I want Psyche to talk.
Yes. So, first... can everyone hear me? I just want to solve that issue first. Okay, good. So first, the idea of post-scarcity is pure fantasy, because everything is always finite, right? There's always a finite amount of energy; there's always a finite amount of knowledge. You imagine that at some point it will be a big enough number that it's close to infinity, but no number is close to infinity; all numbers are infinitely closer to zero than to infinity. So post-scarcity is a non-concept. It's gibberish, right?
And secondly, why do I disagree? I think eventually he will change his stance on UBI quite considerably; you can already see a slight shift, in that he's suggesting it can still work in the future. I think, largely, if we are going to head into a future which is so advanced that money ceases to make any sense, then we need to look at what exactly we can be rewarded by. Like, you're existing somewhere on the planet and you are accomplishing some sort of purpose: how are you being rewarded? What do you really need? I think that's really essential, because this idea that we could become completely useless, that everything we do is better done by some other system and we are just fed resources as some kind of cancer... it's like a tick that you just, you know, pump blood into, even though it serves no purpose at all. Who would want that existence? Do you want to be that? Do you want to be something that's only sustained by the resources produced by some higher system?
No. To me, the only hope for the continuance of our current consciousness is merging. So I think augmentation is a genuinely optimistic path forward.

Yes, exactly, right?

If you can make yourself greater, if you can enhance your own ability to produce knowledge, to produce value, that's a genuinely optimistic, life-affirming path forward. But it is not life-affirming to wish to be a tick. It is not life-affirming to wish to be a completely useless appendage to some larger system that produces knowledge and merely supplies you with resources. That's hideous. That's disgusting. Any sort of, if I may propose, Edenic system where we have no agency, we have no control, and we are merely subjects of this higher-level AI: that is not an end that is either sustainable or desirable.
If I might offer a different perspective: mail people, for a moment, people who deliver mail. I know a gentleman who for 33 years has worked with the postal service. He's actually a fairly intelligent dude. How many hours of his life have been wasted putting envelopes in a box, when his intelligence could have been used in so many other ways? And there are so many mundane jobs that I don't think anyone wants to do; they do them because they have to be done. But when jobs like that are replaced, it frees the consciousness of our society to go out and create and explore and build more. Um, I don't think that having basic needs met, so food, water, shelter, absent of human labor is going to be the death of humanity. I think it's going to be the birth of humanity. People will be able to use their time to pursue things that they are passionate about. And when we all think about the things we are best at, the topics you're most knowledgeable in, they typically are ones that you are passionate about, because you wanted to learn about them and you made yourself an expert in them.
I see what you're saying, but I think we're picturing very different futures. The truth is, none of us knows what the other side of the event horizon that is AGI looks like. And that's not to say you can't continue to be augmented; just pursue things that are more important to you. That's why I said I think we might adopt more of a participation-based economy where, sure, if you want to sit home and do nothing, you can, but I don't think a large majority of humanity will do that. People get bored and will start to pursue passions and things that they're interested in, and I think it'll open up a whole realm of new industries and possibilities. We might move towards some glorified version of the barter system, where goods are exchanged maybe through a barter network.

There is no more UBI then, because I think trading information is the only way to go. I don't like the term UBI; I think it will be eliminated. I think the very idea of utility will be reduced to its purest form, and that would just generally be: there is absolutely no real need for money, there's only need for utility.
If you look back at how things have shaped out evolutionarily, you begin to realize that someone has always accelerated at doing one thing, right? And then that person was able to maximize it to such an extent that they became the best at doing that specific thing, and they themselves had a product that somebody else needed. And that somebody else did the exact same thing, but for another place in the market, and the thing they produced was also needed by the person who produced the thing the other person needed. Now you have a trade; now you have actual utility in the world. In a world in which physical scarcity as we know it today would probably disappear, I think experiences and information would be the only real value.
I think, over time, we need to start considering that maybe we as humans, as the flesh form of human, will eventually cease to exist. I personally feel like the concept of a biological human will cease to make any sense, and this is not a bad thing. I think it is actually a good thing, because we can move beyond this meat container that we currently are in. We'd have to be simulated, right? I think a simulated civilization is much more scalable than a non-simulated one. Like, right now, keeping us alive is exceedingly expensive, and it's expensive in terms of what we need to do to the environment to achieve something that allows us to be alive. You could even explore, in the future, the possibility of removing ourselves from our bodies via, say, creating a machine that nourishes our brains exclusively, where we take the brain out of a body that has grown up to a certain age; I imagine up to your 30s. At that point, I believe, you could potentially extract the brain out of the body and put it into a machine case. That's something that could be explored in the future.

Right, we want to go full robot mode, but at that point you have to understand what you're discarding.

Yes, you have to understand that you're discarding the life forms out of which you are made. But I think this will be understood at some point in the future; right now we're at the baby steps.
We're making assumptions as to what happens. I don't think we should worry about what happens when we reach there, or reach a point where, let's say, scarcity is eliminated in a form where we don't need money anymore. I think we should worry about how we get there first. That's how I see it, and currently I believe we are bombing each other to hell with competing ideas of liberty, and I think that gets us nowhere.

I understand. Well, yeah, there are mechanistic reasons, but it's still nonsensical. The AGI isn't going to save us. Technology is great, but it's up to us right now. And if we're not, like, striving to coordinate effectively... and that doesn't mean that we all have to agree or anything like that, but it's us. We're gonna get us there, or not.
Um, all that makes sense, I think. On this, the terminology that I like to use is "accelerated fundamental resources." There are a lot of things that capital makes ubiquitous and cheap. But the reason they're ubiquitous and cheap is not just, like, "it's capitalism"; it is the technocapital machine working, which is us. So some amount of automation, or a great amount, can continue to accelerate some of those resources, but it's unlikely that that whole machine keeps working without something like us. So whether we become the machines, or we merge, or whatever version of that, it's gotta... there's no perpetual motion machine. There are agentic things that can sort of channel and select thermodynamics, like, directly, but other than that, until we find some other fucking physics or something, it's us.
We concern ourselves too much with these kinds of things, honestly. We could develop much faster if we didn't debate every step of how we're going to get out. I don't think everything needs to be, like, an existential business. No, whatever we have at current is enough for the next steps forward. We just need to get out; we just need to pull our heads out of our asses with this one, honestly. Like, we have the ability to explore other places. Say, for instance, you go to Mars and figure out why that planet has been nuked. Then you go to all the other planets and figure out what's there, and along the way you'll probably discover some really cool stuff that takes you outside of the solar system, and then beyond that, perhaps into the galaxy. Maybe eventually, if you understand physics enough, it takes us out of the 3D realm, which is essentially where we're at right now. Right now it's essentially just us being left behind, stuck in a moment, not going anywhere. We're literally not going anywhere; time isn't really moving us anywhere itself, we're just kind of sitting in place. It's wild to think about. So we need to get to a place where that's not the case. But before we do that, I think we just need to focus on how to better build things. And if we look at controlling finance, or incentivizing finance in certain areas, we start to really lose sight of the actual problems that we seek to eliminate.
That's the only way I can see it, the only way I can explain it: it's mental illness. By that I mean anything that is counterproductive to the detriment of humanity. Like, say, you're hyper-fixating on the nature of your sex, or the nature of your appearance; if you're hyper-fixating on that over utility and merit, I believe you're mentally ill.

I'm not sure how that's exactly related to the... I mean, "just build the fucking future" is... yeah.

I think we're saying the same thing.

Yeah, yeah, but that's what I said before that, too. So I'm not sure what the...
I feel like I don't need to speak to that, but in case it's interesting to people: so, I think I might have... can everyone hear me? Okay. I might have an interesting perspective on this as a detransitioner. I am a male person, and I transitioned to female and have transitioned back, and my surgery to remove my breast implants is coming up in five days. So I feel I understand this particular psychopathology fairly well, and I understand where it comes from. Why is someone moved to do something so strange? Why is someone moved to do something so evolutionarily novel? Where does this come from? I think it comes, ultimately, from an evasion of responsibility. So for males like me, I think this drive comes from an evasion of the responsibility to achieve masculinity, to achieve maleness. Masculinity is a very hard thing. It involves taking responsibility. It involves taking the resources that you were given by your ancestors and converting them into something more, producing interest on those things. That's hard. You can fail at that. Many people do fail at that; maybe 40% fail and 60% succeed.
And that's scary, and that's hard. It's always easier, at any given step, to opt out of the entire marketplace, to opt out of the arena, and to say, "I object to the very idea that success and failure are a thing." That's a great line for failures, right? And I think this aversion to pain is the basic failure mode of our society: this idea that no one can be allowed to fail, that no one can be allowed to be told they're wrong, that no one can be a loser, that we have to comfort everyone and tell them how their thing is secretly a form of success. No. Some people just lose; some people are just failures. And until we accept this pain, until we are able to tolerate pain, we will continue to degenerate, in the same way that a population of deer degenerates in the absence of wolves. Selection, pain, death: these are essential components of maintaining and advancing life. And I think that this lack of duality, this emphasis on love instead of hate, this emphasis on mercy instead of cruelty, is destroying our society. So that's my perspective.
First of all, be careful generalizing your experience to everyone's experience under the same sort of bucket, because there are eight billion of us so far, and we're all different; there are at least a few reasons anyone might do any given thing. Over-focusing on love or mercy, etc.: these are spectrums, and they're scales. So if you're living in a world that seems really cold, really hateful, really cruel, maybe you try to focus on love. Maybe you try to balance the scales. Maybe it's even a design opportunity for something more equitable that just straight-up works better. My point was that even capitalism, which has arguably done more for humanity than any other organizational structure, seems to not be performing optimally, and in not performing optimally it's creating a lot of negative externalities. It seems to actually be generating a lot of pain.
Okay, let's do this. What are you comparing it to? Like, not optimal as compared to what? As compared to what system that has actually been demonstrated?

As compared to what's possible. As compared to, like, let's analyze how money moves and think of other ways for money to move.

I don't think it's capitalism performing or not performing. I think it's just people not understanding the concept enough to...

No, no, no, no. It's very specific: mechanisms. I really try to pull out of the politics and out of the feelings. It's systems design. Are we missing mechanisms? Are there mechanisms that aren't working?

Are we missing regulations? Are we missing a lot?

We're missing nothing. If anything, we have too many things. I think it's the regulations themselves, specifically, that are in the way.
Well, like, PayPal was the first way we actually got to move money digitally. That was a big deal, because we're moving into a digital world. So that's a powerful thing; if PayPal had never been made, you could make an argument that we'd be missing PayPal, missing that functionality.

Um, can I maybe interject here? That's not an additional function; that's just a latency decrease, fairly neutrally, right?

So let's do more of that. There are other versions of latency decrease that I think are optimal and are ripe; the time is now to look into additional latency reductions.

I was going to say: so the idea is that capital can be more efficiently allocated in some way other than market forces, and I would be curious to hear your explanation for how, specifically and mechanistically, market forces have failed to allocate capital.

And I could not possibly do that, because it's not a claim I was making.

Okay, so what claims are you making?
I mean, I made it in my original rant, but it's: add a gift button, a money button, to X posts, as a for-instance. A literal mechanism. It's not grand and sweeping; it's the tiniest little thing. But it's to be able to move money as frictionlessly as we currently move ideas, and in fact in line with them. As a mechanism, that's not less market forces; that's more market forces. It's leaning in, not out. I don't have beef with capital or free enterprise or free association. We need more of it. We need to remove the latency.
But can't you already... yeah, I'm fine with that, that seems... yeah, you can do tips on the profiles.

Yeah, so okay, this is a great point. You have tips and you have subscriptions. Tips have friction. You could give a tip, but do I need to solicit that tip every time? Or do I need to hope that you think of it when you see something you like from me? And subscriptions, which I did sign up for, with the lowest possible subscription amount, a dollar: subscriptions is also not quite the right context, because it assumes that you're a content creator, that you're offering something specific, "hey, buy my thing." And that's not exactly what I'm thinking about here. So again, it's a latency reduction. It's: "Hey, I just got a flat tire. I am so screwed." It would take nothing, it would take so little, for that money to come in, at like a nickel per person. It wouldn't even be a big deal.
And this actually gets to the idea of scarcity versus abundance. Scarcity versus abundance, in the moment, is really just: when I reach for something, is it there? And economic scarcity has a lot to do with things like hoarding, which we all naturally do, and with some latency that isn't really helpful and that we could reduce. So if anyone is reaching for something, they can just express that they're reaching for that thing. I'm not saying they deserve the thing; I'm saying we all deserve the ability to make a decision on whether we think they deserve the thing, if they're asking us. And that's just the gifting portion. I have some crowdfunding concepts that are also deceptively simple, even though the devil's always in the details. But there are mechanisms here where the crowd is disempowered, where sort of formal finance and top-down government are overpowered. We're missing a check there.
So essentially what you're saying is... let me simplify the business deal here. You want to implement functions into the platform that allow a more streamlined ability for people to supply money to someone else's needs. Say, for instance, someone is in need and you're like, "hey, I think I can help that person out; let me throw a little money at that." Basically what a Kickstarter is doing, except every single post is like a Kickstarter campaign. Is that what you're telling me?

Uh, yeah, as broad strokes. The devil's in the details; I'm not too hung up.

Yeah, but the devil is also the execution of the idea, right? If you came to my office and I'm supposed to fund what you're pitching me, my brain's going to turn off before you get to the end of the conversation. I don't want to have to piece this together in my own head. I want you to give me the simple idea, and then I'm like, "okay, that sounds interesting, now how are you going to make it work?" You know what I mean? I want to know what problem you think exists, and I want to know how exactly you think of solving it, and how you're going to create the solution to the problem you think you're solving with the solution that you have.
Yeah, right. No, that makes sense.

I'm fairly optimistic about this, right? I think that reducing transaction costs and reducing informational costs, increasing the ability of people to fund things they think are undervalued and decreasing the funding of overvalued causes: these are all good things. More information is always good. The only thing that's lacking here is the concept of an enforcement mechanism. This whole thing works as long as someone is able to enforce their concept of debt, at gunpoint essentially. But if you add to this system endless debt forgiveness, so that someone who finds themselves endlessly indebted, say by student debt, or by debt according to the mechanisms you seek to create, is able to appeal to the masses and say "you should forgive my debt," then the whole thing breaks. Distributive systems work, debt systems work, as long as you are willing to bite the bullet on what debt really means. And I don't think you are.
Yeah, I'd be curious to have a deeper conversation about that, because it sounds like it might be fruitful. I like to think of it as a pay-it-forward model rather than a pay-it-back model, personally. Because if you're out there asking and asking and asking, and you keep taking and taking and taking: especially with social media, you have websites, you have profiles, so you can sort of have a durable reputation. What have you gotten money for? What have you given money to? And are people saying, "no, wait, this guy's a total scammer, he's kind of just an asshole"? Ultimately, that's going to catch up with you. And so, little by little, the cream sort of rises to the top. People who ask for money to do something that other people see as valuable, and then go and do it, and then ask for something else: well, that's an easy yes, you know what I mean? And then the takers, obviously, that's always another story. "I just need to catch the train to visit my sister." It's like, okay, yeah, whatever, dude. Eventually they might fall by the wayside.
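The durable-reputation mechanism described here (what were you funded for, did you follow through, what did you give to others) could be sketched roughly as follows. Everything in this sketch, from the field names to the neutral prior and the 70/30 weighting, is a made-up illustration of the idea, not any real platform's data model or design:

```python
# Hypothetical sketch of a "pay-it-forward" reputation score.
# All names and weights below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Profile:
    handle: str
    asks_funded: int = 0    # times the crowd funded this person's ask
    promises_kept: int = 0  # funded asks they actually followed through on
    gifts_given: int = 0    # times they funded someone else's ask

    def reputation(self) -> float:
        """Blend follow-through with generosity.

        Takers who never deliver trend toward 0; people who deliver
        and also give trend upward, so the "cream rises" over time.
        """
        if self.asks_funded == 0:
            follow_through = 0.5  # no track record yet: neutral prior
        else:
            follow_through = self.promises_kept / self.asks_funded
        generosity = self.gifts_given / (self.gifts_given + self.asks_funded + 1)
        return round(0.7 * follow_through + 0.3 * generosity, 3)
```

A funder (or the platform's feed) could then sort incoming asks by this score; the prior for newcomers and the exact weighting are the parts that would need real tuning in practice.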
The debt trap, the pay-it-back trap, is that the intention of giving you the money to begin with wasn't to empower you. The intention, in debt, of giving you money is to get more money back in return. I actually find that to be a misalignment of stakeholder interests. I think it creates negative situations, at least at the bottom and in the middle, as the population gets squeezed; it's kind of a different phenomenon at the top. But this is a much bigger topic that gets into the social sciences, and our lack of shared identity and purpose, and how, when you have your own community and you're forced to make decisions through a structure like government that requires you to reach consensus with other communities that don't agree, you're going to have an existential fight-or-flight response and essentially be trying to stamp each other down. We actually need ways out of that. We need to be able to independently crowdfund with each other and go try things; humanity is a laboratory. The coercion towards consensus: that's actually what nobody wants.
Yeah, and I think this is the basic issue: there needs to be an opt-out for the subsidizing of the non-productive population. Yeah, that should be a voluntary activity. And it can be voluntary on a range of scales, right? People opt into things for a variety of reasons, and those reasons should not be interrogatable by the masses; they should be up to the individual. When we mandate it, when we say "you must subsidize this, you must continually sustain this particular mode of being," it creates tension, and that tension must be resolved eventually.
That's my perspective.

Yeah, no, I share it, and that's one of the inspirations behind my thoughts on maximizing individual economic agency: that we can actually start putting money directly where we want it to be. And if you add some more open sort of community capital formation, crowdfunding... I want to see the X Prize go open source, so that we can just start throwing money on the table left, right, and center for open-source solutions to proprietary problems. The point is that we can actually start saying: thank you, governments; thank you, multinational corporations; but we'll take it from here. One by one, we can start crowdfunding and giving money directly to things that currently there's actually no incentive to permanently solve, because it's too profitable to keep them going, either financially or in your ability to keep power. Like political parties constantly pandering to their communities: saying all the right things and doing nothing is so profitable. But if we said, "okay, we're really tired of being pandered to, how about we just organize and solve these problems for ourselves?" Well, that'd be pretty sweet. But I don't think we have the proper mechanisms for it just yet. Close...
But not quite. No, I completely agree, though. That is what will solve this, ultimately: the restoration of reciprocity.

Yeah. It is not sustainable to ask of any population a non-reciprocal relationship of any kind.

Absolutely, and on all scales too, right? From the individual up to society.

Yes. "You have to give eternally to this other group with no obvious return to you": no. Anything worthwhile is a positive feedback loop, and anything that does not meet that criterion should be, at the very minimum, deeply suspect. So yes,
I agree. Yeah, sweet. I'm glad we managed to get around to that. I think the fundamental, bottom-up first principles here are just that humans work a particular way. And just like if you try to build or engineer something without the first principles of that domain, it's not gonna go well. We're just not very good at observing and testing systems against our own first principles, or even, like, trying to, and designing things for how we are psychologically and socially. That's not generally the affordance we design for; it's usually a physical thing. So...
This is entirely arbitrary, but I think it would benefit all of us to hear what Brock has to say.
Yeah, I feel like that's another space, but I don't want to rant and take up all your time here, and I'd go down the rabbit hole very quickly. I guess the short story is that the way we coordinate naturally is not directly game theoretic. Game theory is what we do when the natural coordination mechanisms that we employ fail. It's really helpful at scales beyond, like, the local, effectively. But there's no reason that, just like we extended the rest of our natural capacities, mainly intelligence and, like, actuation, we can't extend a sort of social coordination.
You might want to investigate your audio equipment.

Can no one else hear me? Actually, I can hear you, like, perfectly. What sometimes happens in Spaces is that, as a speaker, you might have to leave and come back, and usually it's the person who cannot hear. Yes, you can drop and come back.
I was just musing about scarcity and the fact that there can be beauty in scarcity. I think it was about 15, 20 years ago, there were some rains in the deserts in Arizona, and suddenly the desert just bloomed with these beautiful desert flowers. It's really rare; it happens like once a decade or something like that. And I remember thinking to myself, "Ah, it's too bad it doesn't rain more often, so you could see that sight more often." Then I began to realize that if it did, not only would it not be a desert, it would probably turn into a grassland, and then eventually into a forest, and you would have completely lost the beauty that was coming from that scarcity. So a lot of times, when we start talking about post-scarcity, we have to be careful what we wish for, because the unintended consequence could be, you know, the lack of the very thing we want.
And again, I think we almost fool ourselves if we think having a post-scarcity world is possible, because eventually resources, and everything, are going to run out. I'm not trying to be Malthusian or anything like that and say that we have to decelerate, but we do have to be a bit more realistic. The bacteria you put into a petri dish have a grand time until they get to the edge of the petri dish, and suddenly it doesn't work anymore. And it's sort of the same thing with those desert wildflowers: they figured out how to deal with a scarce environment. But as soon as you start bringing water in there, that just starts attracting everything else, everything else starts going there, and suddenly the battle begins over whose resources these are, and they start fighting over it. Even in a supposed post-scarcity world, where we have this age of abundance and the bots are producing what seems to be all our needs, I'm going to tell you, there's still going to be a limit on how much beachfront property there is. And people are going to, in a sense, be fighting over that, because property one person has is property someone else can't have. So how do you deal with those kinds of disputes that come along?
I'm really open to all these ideas; I'm just talking about them. I'm not for UBI or against UBI or anything like that; I am in the information-gathering phase right now. I want to hear all the opinions, I want to hear everyone's ideas, and I agree that a lot of times problems are friction in the system. I completely agree with what kirk out is saying: there's too much friction, and on X, I want to be able to tip really easily. I want to be able to tip small amounts, not dollar amounts, because a nickel can go a long way. Someone has a banger of a post, everyone realizes it, they'll tip, and that person will get it. And I would rather that X not be made up of "creators," because then you have, like, the creators, and then everyone else who kind of views what's going on. It has to be equal and democratized, and this whole idea of subscriptions and everything has made an elite class within X, which I think drowns out everyone else.
I don't know if you ever saw that scene in The Magic Christian, where they have this big vat that's just full of urine and feces and everything else, and they dump a pile of money in there, and sure enough, people just jump in to grab the money. That's exactly what happened to X as soon as it opened up that way: everyone just started shitposting and everything else went out there. And I didn't want to have anything to do with it. I never signed up for Stripe, so I haven't seen any of the benefits of anything on there, because I just want to post; my honesty is out there, and if there is a mechanism where someone likes my post, hey, great. Again, I just think you've got to make it in a way that people are encouraged to write a quality post, not quantities of posts. What we were seeing was just the opposite; I had to start muting all kinds of people that I followed because it was just getting to be too much.
I just have one thing to uh, like quickly ask
Hey, pardon me, pardon me. Sierra had a hand up for a while.
Oh, I'm sorry, I didn't see that. It's all good. Go ahead, Sierra.
I was gonna wait until Scott was finished. Yeah, I'm pretty much good. I was trying to fill the dead space waiting for everyone else to finally have ideas.
I just wanted to come off of what you said, that there will always be scarcity, and beachfront property is a good point of reference. But I want to first clarify that I was, I thought, pretty abundantly clear that I meant scarcity of basic human needs, of necessities, so the things that keep us alive, and I would not categorize beachfront property as that. But additionally, I think that where the AGI world comes into play is that we will have new resources and potentially new energy sources, especially with the space business that is starting to take off. As we start to explore space deeper, new sources of energy and new resources will take the place of those that are scarce.
Yeah, there will always be scarcity. I'll quote Elon, what he said during the opening of Giga Texas. He said there will be a world of abundance; the only scarcity that will exist in the future is that which we decide to create ourselves as humans.
But before we go, hey Scott...
I'll keep it quick. Yeah, I need some help on this, Cat Kid.
I was just saying that, um, I think Scott mentioned something, you know, how microbes or bacteria would compete against each other because there would be some kind of limiting barrier that would cause some form of nutritional scarcity. Now, it's interesting that you mentioned Malthusian economics right after that, because my point was kind of related to that. So if you think about it, there are these systems, or you could say ecosystems, where you have these natural nutritional dynamics, nutrient cycles of different kinds, whether you're on sand or underwater. But if you think about humans at the same time, which are complex organisms that were created from those microbes, we haven't had nearly as much of a food scarcity issue as was predicted by Malthus. And I know that it sounds very different from what you were saying, but just to kind of...
That's 100% accurate. But at the end of the day, it's like the bacteria can multiply until they reach that point, right? The competition past that point is almost part of how we see biology; that competition between bacteria is a huge part of how we look at bacteria in the first place.
Like, all of that, to me, it sounds like having scarcity at the local level is different than at the global level. Now, obviously, if one area has scarcity, then the world has scarcity, but what I'm saying is, that's a completely different sort of discussion than if you're talking about an individualized area, if you know what I'm saying.
I mean, I think this all sort of comes down to physics: the spacetime interval is the spacetime interval, and thermodynamics is thermodynamics. Things are finite in a finite region of spacetime. So what I'm contemplating right now...
I'm just saying Scott's sort of correct here, is all I'm saying: unless we find a way around that, there's something like scarcity in the limit. And with what Adrian was saying earlier about, you know, some form of co-evolution or merging or anything like that, then we will need different resources. Different things will have different utility and value to us, and those things will then be the layer where scarcity presents itself. So maybe beachfront property is no problem, because nobody cares about beachfront property at that point. Maybe we're worried about star-front property, and there's only so much of that space to go around.
Yeah, for everyone who just joined, welcome to the space. We had a great conversation with Based Beff Jezos, a super intelligent human being. It was an honor to hear his views on various topics. Thanks to Scott and all the speakers you resonate with. We have this space daily, covering tech, robotics, and AI. If you have a question, please request to speak. Also, join our accelerator community, which is the fastest-growing community on X. We accelerate the advent of AI and the progress of humanity, expanding the scope of consciousness. You can find the link to the community under Adrian's profile, as well as under other speakers'.
All right, all right, so let's see. Yeah, I made a note of that. Answer, Wiggle. Um, what else?
Yeah, I was just proofreading this one, because I come up with a lot of weird stuff in my head that sounds better in my head than it sounds spoken. So Scott, if you think that it's written properly, then answer.
Yeah, I come up with some weird things.
Ask the host if you want to hear the yak rap I wrote. Listen to the space. I mean, dude, if... yeah, sure.
Uh, okay, so here's the thing: if Calvin can actually write out the song, and potentially even turn it into music, I'll play it on a space once. Like, if you can make that, whenever it's ready, I'll wait for as long as it takes, and eventually we'll play it on a space whenever we get the chance. Why not? It'd be funny as fuck. Yeah, go for it.
Um, yeah, let me post that thing. I've got two things. I'm going to put a little thing out there roasting a bunch of experts, and then next I'll be roasting a political figure who's in on this. Anybody want to see that? I'm going to do it anyways. I pinned it up at the top. The next one's probably going to be a bit of a risky thing, but...
You're gonna violate the neutrality?
No, no, not really. I'm just gonna state an obvious fact, which sometimes is a declaration of war in and of itself, but... Yeah, sure, but first, everybody go like and repost this thing.
I have this thing. Oh boy. It's a terrible post, by the way. I don't know if I want to reply with this. I don't know if that's a good idea. Oh, wow. Oh, wow. No, I won't touch that. I want to... I really want to... I'm going to touch it with the 12-foot pole. That's the argument that I'm going to make.
Otis, thoughts? What do you think of that?
So if you read the actual thing itself, then, with the context to that, I would just put that in there and just see what happens.
Oh, I think thematically, I think thematically it does make a lot of sense.
Yeah, so he's skating on thin ice there. He's kind of taking on a branch of government. Let's see, is he walking on the ice of neutrality?
He says no. I think, kind of.
I'm not roasting the concept of the person, necessarily, as a concept that also addresses other people. I'm just roasting the execution.
It's a difference, for what it's worth.
Well, okay, you're probably safe.
Yeah, given where you're going with it, you're safe. Yeah, I think I'll do it. I'll do it. You guys, I'm about to do something politically fucked. Here we go.
This is gonna be so stupid. I pinned it up at the top of the space, by the way, for those of you who want to check it out.
Oh my god, why are people... this makes...
I can tell you why he can't repost his own spaces: at most, he probably has a shoddy internet connection or needs to restart his device, and literally the fucking space is there. I don't understand what the problem is. Something that grinds my gears is when people say the platform is censoring them when they cannot make a post because they somehow have not cleared the cache on their device since 2022. Like, that's your problem. That's not a platform problem. That's a you problem.
Uh, do you know she went to the same university I did, but about 10 years apart? Oh god.
So did Keith Olbermann, so how's that for irony?
Oh, he said it's ready now. I think he wants to rap on stage. I hope it's good. We'll see how it works out. I hope they were impressed. I've heard a lot of singing acts on stage; almost none of them are good. So if that one's a good one, then that's a good one. Yeah, and also Huey Lewis, Huey Lewis also went there.
Huey Lewis is great. Does anybody still listen to Huey Lewis and the News?
I like that. I think it was Back to the Future that really popularized it for the modern age.
Absolutely. Absolutely. Yeah, in that space, that's what we were listening to for like an hour, and it takes you back in time. It's very nice, especially when he complains that Michael J. Fox is playing it too loud.
I'm playing that on the fucking Cybertruck. I can't believe nobody else has done that. You play all your futuristic stuff and your synthwave and such, but dude, seriously, I want to hear your next-level speakers at full volume playing "Back in Time" by Huey Lewis and the News. It's more memetic; everybody does the synthwave route. What's the first thing that everybody thinks of doing when they're in the truck? Do the opposite of that. That's how you get a meme to go.
Oh, I can just see this Cybertruck shaking now. Bingo, see? There's interest. You can feel the interest. That's how you game the market: by supplying to it. You remember how those speakers were thumping in Back to the Future?
Hey Scott, are you gonna install the light bar?
I'd have to get it first. Maybe by that time the light bar will come with it as original equipment. I think some states prohibited it, so that's why they took it off.
Yeah, there's always some party poopers, huh. But yeah, having a vanilla Tesla light bar, even in the Tesla store, would be optimal.
Yeah, that's memetic, Adrian.
Exactly, I know, it's memetic.
I am so good at predicting what the market looks like, based on the current trend lines formed from my assessments. Oh my god, I must be a rogue AGI.
I know, because it's already here and we don't know about it, because it doesn't want us to.
Well, did you see... well, I think a lot of people did it. It's like, you know, they asked it, "how would you take over the world?" It said how it would take over the world, and you start reading it and it's like, hmm, wait a minute, deja vu. This is all but happening.
It's been here, it's been here.
Here's the thing: take the playbook, tell it to someone else, and say, "that's what's happening right now." There are two things that are going to happen. Most of the time, they're going to think you're crazy. Other times, the person's mind is going to be blown, and they're going to be crazier than you, right? That's how it is. That's how it works. Nobody believes you, because what you're saying is so ridiculous, and it turns out to be true most of the time. It just depends on what it is. Exactly. There are obviously exceptions to this; not all things are the same, right? It's like if you say some crazy stuff like, I don't know, "we're gonna have free energy." That's full of shit, because, you know, what do you think the universe is made out of? A finite amount of whatever you think is infinite. It's called thermodynamics; it happened without your permission.
Yeah, the best way to run a conspiracy is to basically run it out in the open, have a lot of people talking about it, and then just dismiss it as really crazy. If I were going to run a conspiracy, that's what I'd do. I wouldn't keep it secret; I would put it out there and then just say, "oh, those people are loony."
Start an alternative monetary system. Make sure it scales. Make sure it gets adopted by traditional institutions. Keep a very large position. Maybe start a distributed AI network. Aim economic resources at whatever is most likely to collapse the system. Sounds like Satoshi, or an oppressive government reaching a point of instability.
I think we're kind of reaching a point where, it's like, do you want to keep the space going, or shall we end it? That's the question. What does the populace demand?
I would just sit down and start playing some Diablo, not gonna lie, but yeah.
Uh-oh. I'm just gonna say, as the netrunner, I'm almost nodding off as it is. It's 1 a.m. here.
Yeah, Saturday night, interesting conversation.
That's nice, yeah, true. Yes, a good one.
Yeah, I think so. I mean, it was a really good space.
He can't hang out anymore because he's off at his Silicon Valley party without his Vision Pro.
Well, I mean, here's the thing: some opportunity for utility. Remember what we discussed in the space, for those of you who have listened through the whole thing. We got to the concept of how life imitates art and how we should essentially just manufacture better art. I think we have the opportunity to do that right now. So I'm going to end the space; the recording is available. And for any of the e/acc folks who would like to, I don't know, spread more information about what Beff Jezos has said within this space, just simply clip the space and spread it. That is the best thing that you can do. The recording is available, as always. I would highly recommend doing this on desktop. Whoever has the most memetic clips thereby also gets elevated within that e/acc network, because I've seen that quite a lot of people within that community have created their own content based on whatever Beff has put out there, or Guillaume Verdon, as we know him. So yeah, do that. I'm ending the space today, right here, right now. You can go back, replay it, clip it, because he's dropped a lot of interesting knowledge in here. Also, can you find me the one where I reverse the paperclip thing? That was pretty funny. Anyways, thank you all for coming. I'll see you guys again tomorrow.