Jackal Town Hall

Recorded: Jan. 24, 2024 Duration: 0:43:48

All right, all right, all right, can everyone hear me okay?
Can everyone hear me okay?
Can you hear me okay?
I can hear you buddy.
Right on, right now.
Marston, how you doing?
Doing well man, just trying to figure out how to get my headphones going on this thing.
That's a classic, that's a classic, all right.
Welcome to the Town Hall on Twitter. We kind of alternate back and forth between here and the Discord, which has been great.
Outside of that, there have been a lot of different updates since our last Twitter Town Hall.
I think the last one that we did do was the update for the 2024 roadmap that we're going to be doing with the Jackal protocol.
With that, since we published that roadmap, talking about which quarter what things we're going to be specifically spending our time and energy on,
we have finished our first one, which is Ledger support.
And Marston, do you want to talk about the road it took for us to become, like, the first hardware-wallet-supported storage network with permissions?
I think that's probably a good thing for you to do.
Yeah, for sure.
So, I mean, realistically, we can go all the way back to the chain launch.
We, you know, first started the Jackal network up, we had storage accounts online and ready to go.
And there's this really cool thing in the SDK called Amino encoding, which isn't actually supported anymore, but it's how all of Ledger works.
So there are two ways to send messages to the chain, a protocol buffer signature or an Amino signature, and Ledger requires you to use Amino signatures.
But everything in Cosmos uses protos, because that is better, and you shouldn't use Amino because it's unsafe and bad to use.
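As a rough illustration of the difference: an Amino signature is computed over a canonical JSON document (sorted keys, no extra whitespace), which is what Ledger devices can display and sign. The sketch below follows the general shape of the Cosmos SDK's legacy sign doc, but the message contents and values here are made up purely for illustration:

```python
import hashlib
import json

def amino_sign_bytes(msgs, chain_id, account_number, sequence, fee, memo=""):
    """Build canonical Amino-style JSON sign bytes: sorted keys,
    no whitespace, numbers rendered as strings, then SHA-256."""
    sign_doc = {
        "account_number": str(account_number),
        "chain_id": chain_id,
        "fee": fee,
        "memo": memo,
        "msgs": msgs,
        "sequence": str(sequence),
    }
    canonical = json.dumps(sign_doc, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).digest()

# Hypothetical message and fee, just to show the shape of the call.
digest = amino_sign_bytes(
    msgs=[{"type": "cosmos-sdk/MsgSend", "value": {"amount": "1000ujkl"}}],
    chain_id="jackal-1",
    account_number=7,
    sequence=42,
    fee={"amount": [], "gas": "200000"},
)
print(digest.hex())
```

The key property is determinism: both the wallet and the chain must canonicalize to byte-identical JSON, which is exactly the kind of mismatch between a chain and a JavaScript client that can silently break hardware-wallet signing.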
But, you know, we built libraries around it to make it safe and good to use. That was the whole struggle for, I don't know, what was it, like a year and a half.
A full, yeah, almost a year and a half now,
just back and forth fighting with Ledger. And, you know, for a while there we had priorities that were just so much more important than Ledger at that time, like that consensus upgrade; all hands were on deck for that.
Ledger unfortunately got pushed to the side a little bit while we worked on making the protocol more stable and functional.
And then every now and then we'd go back to it and get a little bit closer and a little bit closer.
And eventually this time around, we were working on V4, actually.
And, you know, we were trying to make ledger work for V4 and we noticed, oh, ledger does work for V4. Why is that?
So we started tracing it back and we finally found it in this tiny little thing in the communication protocol between our chain and our JavaScript library.
So, you know, that was a lot of sleepless nights for Aaron, our integration specialist.
But we finally did manage to trace it all the way down to some mismatch between how the chain was expecting things and how the JavaScript library was default handling things.
And we finally got it. So now we have these wonderful, wonderful infinite USB drives, as we call them, which is really, really exciting.
It's been a long time coming, but you can now safely and securely store your files to Jackal using a Ledger, so that your seed phrase never leaves your encrypted device.
And, you know, that really does make it the most secure storage protocol kind of possible right now with modern technology, which is really exciting.
Yeah, it's pretty fascinating: your Ledger is now kind of your access to all your money, if you see cryptocurrency as money, and I think a lot of us do.
And now you can actually also use it as an infinite USB drive, which is pretty cool as well, right? You can go anywhere in the world with your private key, plug it in, and have self-custodial access to your own private cloud environment.
It's just pretty cool stuff at the end of the day. And we think that we're just getting started.
We can start to think about peer-to-peer data transfer between parties, secure data rooms, and granular permissions leveraging this technology, with full audit trails while encrypted.
If you do have the permission to see it, it's much more granular, and it's pretty fascinating in its own right.
With that, Jayden, I'll kind of let you kick it off on everything that's going on in the BD front, whether it's the web two space, whether it's the web three space, I'll kind of let you take it away.
Thanks for that, Mike Pass. Yeah, some exciting developments on the business development front. From a web three perspective, we've got some blogs coming out about some recent integrations, hopefully later this week.
Those are in the hands of our counterparts right now; I'm just making sure that my t's are crossed and my i's are dotted.
On top of that, our pipeline is full still with web three integrations waiting to take place.
So we've got some companies and some projects at various stages in that pipeline, which is always healthy.
From a web two perspective, really, really exciting news.
We onboarded a telecom company earlier this week.
So as of Monday, Jackal is now providing backups for a telecom company out of Iowa, which we love.
But other than that, to the future, I think we're going to look and see how we can replicate that process with the telecom company and make sure to keep that pipeline full from the web three front.
So exciting stuff at the end of the day. There's been a lot of work going on in the back end for that one, but it's really exciting to have a real-world telecom business that was started in 1920 integrating with us.
It's pretty beautiful. If they can do it, a lot of other businesses can do it too.
And yeah, really, really exciting times for the Jackal protocol and real-world use cases for basic things like backups, right?
At the end of the day, the Jackal protocol is built to be successful as we get more unique terabytes onto the system.
And for us to do that, we need to kind of broaden our horizons and accept any and all data. So exciting times.
Outside of that, Marston, do you kind of want to talk about things that are going on right now in development for kind of V4 and where you think that is at right now?
Yeah, so we've got some good stuff coming down the pipeline. It's not technically directly related to V4; it's essentially what we have going right now in order to make V4 possible.
There's tons of migration from the current storage providers to the new storage providers to make sure that they don't lose all the files that are on them right now.
Because that's the crazy thing with upgrading something like a storage protocol is you need to make sure everything's still there.
You know, if we wanted to, V4 could be deployed, you know, in a month.
But what we'd be kind of risking is all of the network's files being deleted in one go, which is obviously not what we want.
So every time we do like a major upgrade, we obviously don't want to have that happen. So we have to do a lot of due diligence to make sure that those files stay where they are.
But we are upgrading to a new file format, which is really fun. The problem with that is all of your files are encrypted.
We can't actually do anything to those files. So what we essentially need to do is have this like in between phase where all of your files are being migrated to this new platform inside of our storage providers without changing the data or the security around that data.
Because, you know, we can't change anything, because they're fully encrypted. And then from that, we have to basically transfer all the metadata to a new format.
And that new format allows us to check whether your file was part of that old format or the new format we're talking about.
If it's part of the old format, when you go to use it, you can update it to the new format if you'd like, or you can just leave it and it'll continue to sit there in its old format.
But that allows us to do a lot of cool things, especially going forward. We won't have to do any of these crazy migrations. We won't even need chain upgrades to introduce new file formats, which is really exciting.
So what that means right now is the storage providers before V4 comes out, they're going to get a new upgrade. Those storage providers can go, they can upgrade all of their files to kind of have this like format 1.5 going on.
And that'll basically allow for the new V4 upgrade to basically just like run once they're ready to go. They won't have to deal with like a ton of manually running through their file system to make sure everything's good to go.
So before V4 comes out, storage providers get a new update, it should in theory reduce some of the bandwidth on the network right now, which is exciting.
So blocks should be a little bit smaller, how much we don't know exactly right now, but it should be a little bit, which is exciting for all of you node runners out there.
And then after that, we're going to be pushing on with the V4 upgrade, which is going to go live on the storage providers and the chain in one fell swoop to make sure that everything's good to go.
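The lazy, on-use format migration described above could be sketched roughly like this. The field names and version numbers below are hypothetical; the real storage providers work with encrypted payloads and on-chain metadata, not this toy structure:

```python
from dataclasses import dataclass, replace

OLD_FORMAT, NEW_FORMAT = 3, 4  # made-up version numbers for illustration

@dataclass(frozen=True)
class FileMeta:
    merkle_root: str      # ties metadata to the (still-encrypted) file contents
    format_version: int   # which metadata layout this file uses

def open_file(meta: FileMeta, upgrade: bool = True) -> FileMeta:
    """Upgrade metadata lazily: only when the file is actually used,
    and without ever touching the encrypted payload itself."""
    if upgrade and meta.format_version == OLD_FORMAT:
        return replace(meta, format_version=NEW_FORMAT)
    return meta  # untouched files keep working in their old format

legacy = FileMeta(merkle_root="abc123", format_version=OLD_FORMAT)
upgraded = open_file(legacy)
print(upgraded.format_version)
```

The design point is that the ciphertext is opaque, so only the metadata changes, and it changes per file on use rather than in one risky network-wide migration.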
So that's really exciting. As far as the chain side of things goes, we just have a lot of like little tweaks left, but the econ model is fully implemented, and all of the new storage functions are fully implemented.
So that's really exciting. It's just a lot of like, you know, tweaking RNS to make sure that it is bug free because we've modified it a little bit to work with the new referral system and everything like that.
So little bits like that, we're just running through right now, which is really exciting because that means V4 is in a pretty good state.
And our timelines for Q2 are looking pretty good right now. So I'm really excited for that. I think that's it for V4 stuff.
Yeah, the V4 stuff is coming along pretty good and we're kind of making, we're hitting our checkpoints on time, which is wonderful for that as well.
The third thing that's kind of coming out in the near term would be radiant.
Oh yeah, radiant.
We've spoken about it; we leaked a little bit of Radiant, I think it was yesterday or maybe two days ago, on our Twitter directly, where it kind of looks like Windows 95, which was a creative liberty that some of our devs took, but apparently a lot of people like it.
So I think we have to run with it for at least a little bit anyways.
Marston, do you want to speak on Radiant, and the perpetual, forever storage on a system like that, and how it's going to be pay once, store forever, public by default, all that stuff?
Yeah, yeah, for sure. So yeah, I mean, if you didn't see our little tweet about the radiant user interface, you know, you can go check that out on the Jackal Twitter.
It's pretty funny. And funnily enough, we've been playing with it internally, and it's a really responsive and good-feeling user interface, despite how it may look.
But essentially, what it is, is it's kind of like a pseudo dashboard. So it's not quite, you know, that Jackal private secure dashboard that you're used to.
Instead, it's more for sort of public data that, you know, you don't mind if anybody sees; it's to be published in the open, a lot like, you know, IPFS or Arweave in that way.
So you can kind of think of it in a similar way as, sorry, not IPFS, NFT.Storage, which is the Filecoin NFT sharing platform, which is really fun.
This radiant dashboard basically just allows you to go. And, you know, if you have files, you can just chuck them up there, can click to upload, you pay once.
You don't have to have a storage account at all. As I say, you just click that button, you sign your transaction, that file goes off, you pay a flat rate of Jackal storage.
It's the same price as the current offerings on Jackal. So that $8 a month per terabyte, but scaled down.
So if you're storing like one megabyte, it's going to scale itself all the way down to the price of one megabyte per month for 200 years.
So you're paying for that full 200 years. What that does is it guarantees that it's there for close to forever. I mean, longer than anybody will ever be alive, basically times two.
So, you know, that allows it to stay up there forever. If you want, you know, in 200 years, you can go and renew that file if you're worried about it.
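To make that scaling concrete, here is a back-of-the-envelope sketch of the pay-once pricing, assuming a flat $8 per terabyte per month prorated linearly by file size over the 200-year term (the actual on-chain math and units may differ):

```python
def radiant_upfront_cost(size_bytes: int,
                         usd_per_tb_month: float = 8.0,
                         years: int = 200) -> float:
    """Pay-once price: the monthly $/TB rate scaled down to the file's
    size, multiplied out over the whole 200-year term."""
    tb = size_bytes / 1e12  # assuming decimal terabytes
    return usd_per_tb_month * tb * 12 * years

# Storing 1 MB for 200 years:
print(round(radiant_upfront_cost(1_000_000), 4))  # -> 0.0192, about two cents
```

Under these assumptions a full terabyte for 200 years comes to $8 × 12 × 200 = $19,200 up front, which is where the sustainability argument comes from: the providers are paid for the whole term rather than betting on hardware getting cheaper.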
But that makes it a really sustainable model for us, especially in comparison to something like Arweave, which promises like permanent storage.
But they make a lot of assumptions that storage is going to get cheaper over time and continue getting cheaper forever.
So us at Jackal Labs, it's not that we don't believe storage is getting cheaper; we just don't want to base our protocol on the assumption that storage will get cheaper.
So that's a decision we made. Basically, what that does is it makes sure that our storage providers are incentivized for that full 200 years.
Instead of, you know, if storage doesn't get cheaper, they're only incentivized for 10 years, which wouldn't be too good.
So, you know, we're really trying to future-proof the system, really make sure this file is stored for the full 200 years that you expect it to be stored for.
And you can expect all that with Radiant. So this is really, really good for, like, NFTs. If you want to store NFTs, you can just throw a file on there.
It's up forever. You can't delete the 200-year storage, which is a really nice feature for NFT owners because they know that, you know, the host of that NFT won't ever go down.
And then kind of on a side note, the stuff powering Radiant is exciting as well.
That, you know, 200-year storage that you can't delete makes it really, really useful for things like AI agents where, you know, you're looking at an AI model that wants to have memory, let's say.
So, like, conversation data: if you're chatting with, you know, some ChatGPT alternative, when you're done that conversation, if it's running in the cloud, that cloud has to store that conversation somewhere.
So in a decentralized world where you own your data, you know, you're hoping that those files that you're storing stay there and they can't be read by anyone.
So this 200-year storage allows you to upload private, like, conversation history that is stored for 200 years and you can verify it's stored for 200 years.
So, you know, if an AI agent posts it to Jackal on your behalf, you know that that AI agent can't take it down, unlike on a traditional cloud system.
So you can guarantee that your stuff will be there for longer than you'll be alive unless, you know, some crazy tech happens in the next couple of years.
But yeah, I mean, it's pretty exciting for us.
It's a really sustainable way to store data for a very, very long time. And we built Radiant to be this really, like, fun, interesting front end to kind of showcase that and make it really accessible for you guys.
One more thing is just the Jackal Shepard gateway that we have been working on for a while is kind of bundled into Radiant.
So if you post a file, you can just click that little, like, view online button and it'll take you to the Shepard web gateway.
And then your file will be viewable there. You can share that link with friends.
I've even done it where, you know, you can post it into Discord and Discord will just display it as an image, because that's all it is.
It's just basically almost like a CDN if you're familiar with that.
It's a really easy way for you to share files with people around the world using just a simple link. So that's really exciting.
No, it is super exciting. The other thing that I'm excited about specifically is just publishing of documents, too, right?
So say that we want to create a blog and we want to publish it for a really long time.
Say we're writing a white paper, or whatever we like to call these.
Sometimes people call them yellow papers or gray papers. It depends.
But having a self-sovereign way to publish that isn't vulnerable to supply chain attacks.
It's extremely resilient in the areas of decentralized AI, decentralized publishing, and just access to information in general.
I'm excited for it for all those reasons, and having a little bit more of a multiplayer mode alongside the privacy is really fun.
And we have a ton of users that are using the Jackal dashboard proper for that kind of personal, private, Dropbox-type product running on the blockchain.
I think this is wonderful for kind of something that's a little bit more multiplayer mode where you can do it with your friends.
You can share images. You can have shareable links and you can kind of have this for integrations with NFT marketplaces or decentralized social medias
or kind of Reddit clones or anything like that that are kind of playing in the blockchain meets social space.
Outside of that, there's one last thing that's been happening this week, specifically at the intersection of Jackal meets AI meets cloud compute,
where we've been really focusing on the ability for Jackal to have an integration with a company named Morpheus,
which is they're essentially kind of like an AI agents product and they're building on Ethereum and what they're building seems quite promising and we wanted to get involved as well.
The kind of like lowest hanging fruit for us for an integration with a product like that and kind of bringing value for those people that are looking to build decentralized AI applications in general is number one chat history and memory.
I also touched on this a little bit, but essentially when you start chatting with a GPT-type chat bot, or really any AI in general,
most of them, specifically the decentralized ones, don't have memory, and don't have decentralized memory or self-custodial memory.
And that's something that Jackal does really, really well: having the ability for AIs to store weights, to store chat history, and to find other integrations as well as we move down the pipeline of what we have going on this year,
what we're looking to develop this year, you can start to think about AIs having their own storage account where you can kind of read and write to off-chain data, really exciting stuff in that area.
The next thing that we're kind of looking at as well is RAG frameworks.
So a RAG framework is retrieval-augmented generation, which is kind of a way to build these lightweight, fascinating AI applications that are grounded in specific data so the model can have context.
So for example, an AI trained back in 2021 unfortunately only has knowledge up to 2021, and it doesn't have access to private files. If you're a company that has sensitive information that you can't share with any third parties,
You are kind of limited with your ability to integrate with AI in general.
So this is a pretty far out for us, but it's something that we have on the roadmap is kind of having a way for Jackal to be a private vector database storage, which kind of gives the ability for you to kind of train these chat AIs where you have this unique ability where it can be trained on private data.
Things that you're, whether it's by law or information that you just don't want to give up and you don't want to introduce a third party as like a cyber risk or as a privacy risk for users or your company.
You can start to build these custom AI models, which is pretty fascinating as well. And that's something that we're looking at integrating, kind of similar to a company called Pinecone, which would be the centralized version of that. It's just another thing that we're working on right now in general.
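As a rough sketch of the retrieval half of RAG: documents are embedded as vectors, and the query's nearest neighbours become the model's context. Everything below (the document names and the tiny 3-d vectors) is made up for illustration; a real system would use a proper embedding model and, in the private-vector-database scenario discussed above, an encrypted store:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "private vector database": name -> embedding.
docs = {
    "q3 internal revenue report": [0.9, 0.1, 0.0],
    "office plant watering rota": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k document names most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.0, 0.0]))  # -> ['q3 internal revenue report']
```

The retrieved snippets are then pasted into the model's prompt, which is how a model frozen in 2021 can still answer questions about your private, current data without that data ever being used for training.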
Marston, I know AI isn't your favorite topic on the team, but what are your thoughts on the AI stuff in general?
Yeah, I mean, I think the AI integrations with Jackal are like, I mean, you know, I think Jackal is one of the best places to store AI data, especially with how sensitive and how valuable AI data is right now.
It can't really be overstated how much money is going through AI training models and AI user data. And I mean, I can only speculate that companies like Google and ChatGPT are taking, you know, all of your conversation data and using it to make their AI better.
I almost guarantee that's somewhere in their terms of service. But with something like Jackal, I mean, the whole ethos is you own your data.
And if you are talking to an LLM, you should still own that data, you know, the LLM provider shouldn't own your conversation, especially in this decentralized landscape that we're trying to build.
We're trying to build these like ethical tools. So companies like Morpheus doing AI agents, where everything's backed up to Jackal, I mean, nobody owns those AI models, nobody owns your conversation except you, like that is the kind of stuff that we're really working for at Jackal Labs 100%.
I think like the Jackal protocol, there's no better place right now to store sensitive AI data, like just without a doubt.
Yeah, it's kind of the AI developers have two doors right now, really, right? So you have door number one, where you kind of have to invest hundreds of thousands, if not millions of dollars into infrastructure that you would host locally if you want to build something that needs to have privacy,
or needs to kind of have a really high security posture around the data that the AI is using, right? Door number two is you kind of integrate a third party, which can kind of be this centralized choke point of failure number one, or also the cybersecurity or privacy risk to the data itself, right?
And it's pretty fascinating that there's kind of no middle ground. And I think that Jackal can really be that middle ground for people that kind of want that scalability at a reasonable price point, and also have the ability to maintain full self custody and privacy around that data.
A few months ago, I have a friend that is a partner at a consulting company where they consult heavily for the Canadian federal government, for example.
And they were running into issues where they wanted to build LLMs to increase the productivity of their teams internally at the Canadian government.
But the issue is that, to increase that productivity, they needed to train the AI on private data.
And they had no way, and they continue to have no way to do this, because if you want to leverage kind of one of the best models right now, for example, GPT-4, and you want to integrate the API,
technically you're giving a third party access to all of the private data, and by law, they can't do that.
So hopefully we're getting closer to a stage where you can maintain privacy and security while still building scalable AI applications, and not waver on security or privacy, or, for people building decentralized AI, the decentralization factor either.
That's kind of everything from me right now. It's been a long week so far, but I think we have one request that I'd love to bring up.
If you can hear me, feel free to chat right now.
Sorry, I raised my hand. Are you speaking to me?
Yeah, exactly.
Sorry, thank you. Just a cut off at that moment, so I didn't know if you, I didn't want to just barge in. Sorry to kind of come in. I kind of found out about Jackal Protocol from CryptoSailor, who's listening, who I have a lot of respect for.
I kind of will put Jackal in, it might not be, but in a group where there's Render and AKT, Akash, kind of in that group. I don't know what you want to call them in that group, but decentralized, something processing and storage together.
My question is, this is why I keep asking Akash and AKT, but they never answer. So now I've got you from Jackal.
What is the relation between your token and what you do? Because there's a lot of tokens where they just are a token to represent a company, but they actually don't do anything.
So therefore, they don't grow, your investor doesn't earn anything. For example, again, AKT, they've got processing powers and cheap processors where you can rent them out, but the token itself doesn't get any value out of it.
So how do you get value for your token, the Jackal token out of storage? That's my question. Thank you.
Yeah, that's a really good question. And we can go really deep into the economics. And luckily, we actually have our new economics upgrade that's coming out in kind of early Q2 with the V4 upgrade.
And I can link that; I can send it to you directly in the DMs as well, so you can take a look at it. But in short, the difference between Render and Akash and Jackal is that Render and Akash are kind of like compute marketplaces, where there's a multi-sided market and their token is the medium of exchange between the buyers and sellers of compute power, basically, right?
The difference with Jackal, for a number of different reasons, is, number one, Jackal is the medium of exchange for storage, right? So first and foremost, users need to pay in Jackal tokens.
In the event that we start to take other tokens, users are going to take the USDC, buy back Jackal tokens, and then pay the protocol in Jackal tokens. That's number one.
Number two, the Jackal token is an incentive for storage providers and validators, right? So for example, validators' job on the network is to secure the network, number one.
Number two, they're kind of like the police of the storage network: they make sure that every single storage provider continues to have the data they claim to have, and in exchange for that work, they get rewarded with Jackal tokens in block rewards as well.
The third thing the Jackal token does is take care of the storage providers as well. Essentially, when someone comes into the network and needs to purchase storage, they pay $8 a month per terabyte to the protocol right now.
And then the storage providers get rewarded with real yield from that payment. Currently, the real yield going to the storage providers would be about 35% of that $8.
And the rest goes to protocol-owned liquidity, referral commission if there is one, and back to the stakers. So the way the token gains value is that the more unique terabytes of storage purchased from the Jackal protocol, the more economic growth for the Jackal protocol.
I don't know if that answers your question, but essentially, the more storage that we sell, the more demand for the Jackal token will be.
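Loosely, the flow of a storage payment described above could be sketched like this. The 35% provider share is the figure quoted in the call; the remainder is lumped into one bucket here because the exact split between protocol-owned liquidity, referrals, and stakers wasn't given:

```python
def split_payment(monthly_usd: float = 8.0,
                  provider_share: float = 0.35,
                  referral_share: float = 0.0):
    """Illustrative split of an $8/TB/month payment: ~35% real yield to
    storage providers, an optional referral cut, and the remainder to
    protocol-owned liquidity and stakers (proportions assumed)."""
    to_providers = monthly_usd * provider_share
    to_referrer = monthly_usd * referral_share
    remainder = monthly_usd - to_providers - to_referrer
    return {"providers": to_providers,
            "referrer": to_referrer,
            "liquidity_and_stakers": remainder}

print(split_payment())
```

So on the default numbers, providers earn about $2.80 of every $8/TB/month in real yield, and storage demand, not speculation, is what routes value through the token.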
It does. It absolutely does. No, thank you for the answer. Another one point which I feel people haven't solved. So if you go back to the original Bitcoin-wise situation, it's all peer-to-peer, P2P.
The issue with everything that now that's coming up, and a lot of us in this space are non-developers, non-programmers. We come from all walks of life.
And I would like to invest in this token, but I also would like to be a validator or a node or whatever you want to call them. So for example, if I wanted to say I want to invest in Jackal, but I also want to feel part of Jackal, I'll need to then run some storage.
I'm looking now on Amazon how much to buy a one terabyte external hard drive. You can get them for $45, $50. And let's say I've got a thousand pounds, a thousand dollars, and I want to just buy 20, 30, 50 of them and run them.
Nobody is telling us how to do that. It's just become a select group of you devs and you. I mean it generically. Nobody's given us how to do it for a beginner. Here's 10 hard drives, how to link them up and become part of your network.
While with Bitcoin, there is these easy steps where you can just run the node on your computer. You're not going to make any money, but you are part of it. Does that make sense? Nobody's done it. AKT, render, any protocol. Go have a look at it.
Very rarely you find step by step how to become part of the network and earn like the devs or earn like the nodes.
Yeah, no, 100%. So that's something that's a little bit difficult, because right now, in the early stage of the Jackal protocol, not that we're trying to keep anyone out, we have a really awesome storage provider community just in the Discord.
If you ask a question, probably four or five people are going to come and just help you out. If not, we will come and help you out to make sure that your onboarding experience is wonderful, because for this protocol to be successful, we need the supply side and the demand side of storage.
And it can't really be just a select group of people; we need as many unique terabytes as possible for us to go and sell to other users, right? So that's number one.
There are kind of three ways that you can get involved with the Jackal protocol, and each of them comes at a different technical difficulty.
Way number one, which is the easiest, and this is coming with the upgrade that's going to be happening in Q2, is any user with their RNS name. So if you go to the Jackal dashboard and spin up your storage account, you'll get an RNS name.
This is just the native name service on the Jackal protocol. It makes life a little bit easier, so you don't have to remember jkl1 and a really long string of letters and numbers.
You can just have a name like neteveryone.rns, and that is your RNS name. With that, that is, number one, a way for people to send files to you and do all that kind of stuff.
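Conceptually, a name service like RNS is just a chain-stored mapping from a short, human-readable name to a long bech32 address (Jackal addresses start with jkl1). A toy resolver, with a completely made-up registry entry and address:

```python
# Hypothetical registry: in reality this mapping lives on-chain.
rns_registry = {
    "alice.rns": "jkl1exampleexampleexampleexampleexample",  # made-up address
}

def resolve(name: str) -> str:
    """Look up the address behind a human-readable RNS name."""
    try:
        return rns_registry[name.lower()]
    except KeyError:
        raise ValueError(f"{name} is not registered") from None

print(resolve("Alice.rns"))
```

The same short name can then double as a file-sharing handle or a referral code, which is why one registration covers both uses described here.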
But number two is that's also just a referral code. So if you have a business, or you want to find a way to get integrated with the Jackal protocol, the easiest, lowest-hanging fruit without technical knowledge is really just going to your friends or business and using that referral code.
And that's a good way that you can kind of drive, be involved as a community member, drive value to the network and also get rewarded for providing that value, right? So that's kind of number one.
Numbers two and three are validators and storage providers. For validators in general, it's probably easier to be a validator than a storage provider,
I would say at least. Marston, I don't know if you would agree with that, but being a validator is pretty much just having the technical knowledge to spin up a server in a data center, integrate with the protocol, go through the docs, and understand a GitHub repo.
But kind of by default, you need a little bit of technical knowledge because we're dealing with technical stuff, right? Not that it's meant to be a barrier to entry.
It's just the fact of the matter is with the stage that this tech is in, it does take a little bit of prerequisite knowledge of kind of running those kind of things.
And then the storage providers are kind of very similar. You can use an old gaming computer and hook it into the network, number one, or you could rent a server in a data center, hook that into the network, and provide it that way.
For onboarding for that, honestly, the Discord is the best way to do that though. All you have to say is like, hey, I have this hardware, or should I buy this hardware would probably be a really good question to ask before you buy anything.
And then you'll get really honest feedback, whether it's from the Jackal team, from Jackal Labs directly, or from other individuals in the community that are storage providers.
No, thank you very much. I'll explore your website because it's actually the first time I've opened the website itself and it does have a few frequently asked questions and how to provide storage.
I think that's generally most of the things I wanted to ask at this moment, but yeah, thank you for letting me speak and I'll disconnect the mic.
Also, thank you for asking these questions. And if you have any issues, whether you want to be a storage provider or anything like that, please let us know, because it's really important for the build-measure-learn feedback loops we have going on internally, so we can make a better product and be a better organization in general.
So any feedback, please share with us.
Thank you very much.
All right, we have one other person coming back up now.
Let me see if I can.
Is it just me, or is there a fourth speaker up right now? We might just be getting spammed with this, I don't know.
I feel free to ask your questions if you can hear me right now I'm not sure if you're fully up yet though.
Is it just me that sees him as a speaker, or is it just my... No, I can see him.
He's just muted.
If you want to ask a question, feel free. If not, we can start to line this down unless there's any other speakers that do want to come out.
I'm not sure if it's me glitching out or someone else glitching out or see the travel account.
I don't know. I mean, I don't see any of the speakers anymore.
Alright, anyways, if anyone has any last questions, feel free to come to the Discord or Telegram or tweet at us, whatever medium of communication works for you. But outside of that, thanks everyone for an awesome town hall this week. If you have any feedback for us, just let us know, and we'll
see you at next week's Town Hall on Twitter.