Transcript

Disclaimer: The transcript that follows has been generated using artificial intelligence. We strive to be as accurate as possible, but minor errors and slightly off timestamps may be present.

YosemiteSean (00:00):

Welcome, welcome to the Generaitiv Community AMA. We've had a lot of developments lately and we know you guys have had a lot of questions. So we wanted to take a minute to let the community gather together, give you guys some updates we're really excited about, and let you guys have an opportunity to speak directly to the team. So we'll give that another minute there to get things going, but great to have Cob and ZK in here. We've got Josh watching patiently as well, helping out monitoring Telegram, making sure any questions can come through there. So yeah, thank you for coming out, everybody. We're excited to have you and be able to put out some more updates for you.

cob analyst (00:40):

Yeah, thanks everyone for joining. It’s great to be back on our own space and in the last couple of weeks, we’ve been out in everyone else’s spaces, everyone else’s AMAs. That’s quite the campaign set we did there, but yeah, nice to be back and have you all here on our own AMA.

YosemiteSean (00:56):

Yeah, it’s pretty nice to be able to host one of these ourselves, have more of an intimate setting too, where we can really take time to dive into the new points of the project we want to expand upon and really just get into the nitty gritty, you could say, of the project. So I’m really looking forward to being able to share more with you guys. Also, just while we’ve got a minute here, really looking forward to seeing more of those horrific and amazing hand submissions for the hashtag GAI hand contest. We’re really surprised that you guys have actually been able to come up with a few that possess the right number of fingers. So please send in whatever you got; the crazier, the better. Submissions for that close in about two hours. So thank you guys for your participation there. And it’s been honestly amazing seeing just how those contests are going so far.

zkbrain.eth (01:54):

I see Ashwini here. Ashu, how’s it going? Yeah, Winslow, appreciate the support in the community. We’re probably looking at maybe just one more minute to let some people kind of file in here, and we’ll just get going.

(02:12):

All right, this is gonna be a bit of, kind of more of, like a fireside chat, just kind of updating the technicals on the project. Still incredibly excited. We’re actually just a month in here. And yeah, just absolutely great growth in our community, especially in Telegram. I actually love the participation, the engagement we’re getting with the contests. We’re gonna actually announce a slight change to the contest in a few weeks. And the reason why we’re doing this is we want to kind of build some more utility out on the platform. What we will be doing is actually having the daily contest, but 50% of it will be paid in ETH and the other 50% will be paid out in GAI. What that will do is, obviously the winners are more than welcome to do whatever they want with the GAI. We absolutely hope that they will hang on to it from a utility standpoint, short term, so that they can enter a larger weekly contest. And it will be the same format, creating art; there will be a weekly theme and time to actually make those submissions. And it is only going to be open to GAI holders.

(03:49):

Also to that, I wanna kind of drop a slight teaser here that we will be also launching our own AI-generated NFT collection, which will have some early whitelist access for GAI token holders as well. So now that we’ve kind of got some people kind of filing in here, I do wanna kind of say, like everybody, you know, like, do your own research.

(04:21):

I still feel quite confident that out of all the AI crypto intersection projects, out of all of the projects out there, we at this point in time still have the most real Web3 utility. I’ve been seeing an uptick in organic sales of art being sold on the platform. Our numbers keep going up on the submissions and creation of those images. We’re well over 60,000 images created on the platform. So there are some really good positive stats on the usage and utility here. And what we wanted to do was just hold this AMA. I know there’s a lot of questions out there. I absolutely love the passion and the appetite that the community holders have to find out more and what’s next for this incredible platform. So I wanted to kick this off and just let Cob here dig into some of the technicals and what to expect in the next little bit.

cob analyst (05:30):

Yeah, let’s actually start with some dev updates of a few things that either have already been released in the last week or are being released later today. First one, some of our more astute users may have noticed that when you’re generating an image, there’s this little advanced dropdown with all these options, all these configurations for models. We’re actually going to open that up as of today. So what that will allow people to do is really customize the way that an image comes through, in terms of basically all the under-the-hood operations that the model’s actually doing, and allowing people to configure and customize that. It’s also going to open up just better prompts because we’re enabling the negative prompt field. So what that’s going to allow people to do is put things in there like no extra limbs, no extra fingers, and really fine-tune the results that they’re getting. And that will be available across all of our models. So that’s a pretty big update in and of itself, but we are excited to see what people do with that. Moving on, a couple of, I’d say, quality of life improvements.
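
Generaitiv's generation backend is not public, so purely as an illustration, here is how a negative prompt and the usual "under-the-hood" knobs are passed to a Stable Diffusion-style pipeline with the Hugging Face diffusers library. The model ID and parameter values below are placeholders, not the platform's actual configuration.

```python
# Illustrative only: Generaitiv's backend is not public, so this shows the
# generic way a negative prompt and common generation settings are passed to a
# Stable Diffusion-style pipeline via the Hugging Face diffusers library.
import torch
from diffusers import StableDiffusionPipeline

# Example model ID; the platform may use different models under the hood.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")  # assumes a CUDA GPU is available

image = pipe(
    prompt="portrait of a wizard, detailed hands, studio lighting",
    negative_prompt="extra limbs, extra fingers, blurry, low quality",
    num_inference_steps=30,  # the kind of under-the-hood knobs an
    guidance_scale=7.5,      # advanced options panel typically exposes
).images[0]
image.save("wizard.png")
```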

(06:50):

We’re actually simplifying the way that our token URLs are generated and the way they look. So if you’re trying to share the links around on Twitter or elsewhere, you may have noticed they’re quite long when linking to a token, a beautiful token you just put up for sale. So we’re changing that, so they’re going to be a little more pleasant to look at. I think that’ll have a huge impact when sharing them across socials and sharing them in Telegram. It’s going to be great.

(07:24):

On collection pages, we don’t really have the ability right now to see which Twitter account is associated with that. That’s changing. So now you’ll be able to see the original author, actually, their Twitter account kind of directly there. We have some improved error messaging on the image generation page. I know not everyone sees that often, but for those who were seeing it, it was kind of quite confusing sometimes. So we improved that. Let’s see what else we got here. Some small UI fixes, some standardization of some buttons, making things look a little bit better.

(08:05):

For those of you who’ve noticed as well, in the last week we rolled out some initial transactional email functionality. We’ve added the ability to resend verification emails for anyone who might have some issues there. On our homepage, we’ve changed the recently listed items to now display names rather than descriptions. It makes a big difference there, kind of cleans it up a little bit.

(08:33):

We’ve got a few other small discrepancies on our UI, like on the offer modal. And there was a UI issue where, when you manually disconnected a wallet, you had to reload, but that has been corrected. So lots of little things. The team’s actively fixing up a bunch of issues there and, yeah, some new features as well.

zkbrain.eth (08:55):

I’d like to kind of add, the cadence is still going to stay the same. We’ll provide weekly updates, but in terms of the impact of what you’ll see on the platform, we’ll probably go into more of a building and development mode. What we need to do is tackle some bigger tasks. Now that we’ve launched and addressed all of the main usability bugs and just some strange UI/UX issues, we’re going to switch into a cadence where we’re actually going to be tackling larger, more complex feature sets for the actual Generaitiv platform.

(09:36):

So we kind of mentioned that model provenance, us getting some model creators on board, is absolutely critical. It is part of our next step in terms of proving out our thesis. So what we will be doing is providing the updates, but obviously in terms of touch and feel, it’s going to be a bigger task that won’t launch on a weekly basis, though we’ll still continue to provide more usability and features. So yeah, just concretely, you’ll still hear weekly what’s going on. What we ship may not appear on the front-end system on the same kind of weekly cadence.

cob analyst (10:21):

On the topic actually of model creators and, let’s say, model upload, we are at a stage now where we are ready to accept user-generated or artist-made models. It’s still a manual process at this point, meaning it’s going to require some interaction from the team to help get it onboarded to the platform, but our infrastructure and all the underlying tech is at a point where, if someone’s got a model in hand, we are ready to feature it on Generaitiv. The way I see this going, and really the team’s hope here, is to actually get artists to release their model exclusively on Generaitiv for a period of time or just in general. And I think that would be very interesting. Every day, there’s probably 40, 50 models published or updated. We’re going to be working with some artists to get there. We want it first on Generaitiv. So you can expect to see that coming in the next few weeks.

(11:23):

And actually, with that, it’s an open call to all. Again, if anyone here in the audience has a model that they would like to be featured on Generaitiv, please reach out. We will work with you to get it on there.

YosemiteSean (11:35):

I know a few people are already reaching out to me about that. So I’m pretty excited to be able to get them on board.

cob analyst (11:44):

Let’s see, what’s next on our list here? I think we should talk a little bit about some updates in AI generally. We saw Adobe in the last week release Firefly, kind of their take on generative AI and the tools that they’re including in their Creative Cloud suite. It is early beta at this point.

(12:06):

It’s quite interesting, their approach, in a few respects. The biggest one being that it’s not built off of Stable Diffusion or any of these other existing models. They’ve done it entirely in-house from their stock photo library, which is quite interesting. It shows that you don’t necessarily need a huge internet-wide data set to produce a good model, which is good news. We expect to see artists be able to make better models based on their own images.

(12:44):

One other interesting thing, in terms of compensation, is they actually are compensating people through their existing stock image licensing agreements. So it’s really nothing new there, but they’re one of the first organizations to acknowledge that, hey, the people contributing to models need to be compensated. So that’s very good validation for us in what we’re approaching here.

YosemiteSean (13:14):

Always great to see other like minds, especially like minds as big as Adobe.

cob analyst (13:22):

Yeah, it’s huge.

YosemiteSean (13:24):

Well, with that, do you want to dive any deeper into further updates or do you want to take a second there to start jumping into some community questions?

cob analyst (13:33):

I’ve actually got quite the list of things here still to go over. I know Asher keeps trying to connect here and doesn’t seem to be able to get up as a speaker, still working on that. I imagine he’s got a question or two for us.

(13:46):

Let me see if I can help him out there. Yeah, see if you can help me with that. In the meantime, the next thing, this is a big one. Everybody’s asking, everybody wants to know, when white paper, right? Soon, very soon. In fact, the team is now confident that we will have that released within the next two weeks. And because we’ve said it, you know it is true. So please expect that it is coming. That being said, we do have some of the details from the white paper that we are ready to share today. I know that is very exciting. But first, I think we actually wanted to talk about that 30% of locked tokens and what that’s for. I know we haven’t really said much about it, except that it’s for team and community and stuff. It’s for team and community incentives. I think that’s the extent of what we said so far. But we have some more details for that. So some of that is allocated, it’s probably about 5% or so for centralized exchange listing.

(15:01):

5% of that is allocated for staking rewards and community incentives. The remaining percentage is allocated mostly for strategic partnerships. Again, we’re going to need to be working with a lot of other organizations here to make our total vision happen. We will also need to allocate some tokens for new members of our team. And then the remainder, of course, is for the existing team, our in-house team here. And the way we look at that, you want us to have a bag here as well. So there’s a good percentage there set aside for the in-house team.

(15:50):

I mentioned staking. I don’t have the full set of details I want to share today, but I do have some. It is some new information. I think it’s going to be interesting. I think you guys are going to like it. So really, it’s a two-tiered system. We see that as people who have GPUs and people who don’t. Now, with respect to how we’re going to release this, the non-GPU staking, so that’s holders who just have tokens, will be released first.

(16:24):

Not going to say when that’s going to be released, but soon. And what that staking will do is it will allocate the platform fees and other fees that we are collecting from minting, from secondary sales and whatnot, as rewards for those who are not running GPUs. Now, for those who are running GPUs, that is, of course, coming a little bit later in our roadmap. However, what I will say is those users are providing useful work. So the expectation is they will, of course, be rewarded more heavily based on the work that they are doing, picking up jobs. So when somebody puts up an AI workload, so that’s an image or a piece of music or a piece of something else, whatever that might be, maybe even some surprises there, they will do the work, post up the work, and they will be paid in GAI for having completed the work. Of course, there’s some validation and some other metrics behind that to ensure the work is done properly and correctly. And yeah, they’ll be rewarded based off of the jobs completed. There’ll also be a reputation system because we don’t want bad actors. We don’t want people coming in and messing up our generation. So we’re aware of that as well.

(18:00):

And one other interesting thing: if you really like a validator, or, sorry, if you really like a tier two staker who is doing some great work, doing some great generations, you can also, as a non-GPU staker, kind of a tier one staker, delegate your stake to them so that you can earn and kind of tack on to that part of the rewards there. As you can imagine, the more staked behind a worker, the better. I think that’s all I wanted to cover today on that.
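
The staking details above are still being finalized, so the following is only a minimal sketch of the mechanics as described: tier-two GPU workers are rewarded by jobs completed, and tier-one holders can delegate stake to a worker and share that worker's rewards. Every name, number, and rule here is a hypothetical illustration, not the team's actual formula.

```python
# Hypothetical sketch of the two-tier staking idea described above.
# Nothing here comes from the team; it only illustrates "pay GPU workers by
# completed jobs, and let tier-one stakers delegate to a worker and share in
# that worker's rewards pro-rata by stake".
from dataclasses import dataclass, field

@dataclass
class Worker:                      # tier two: runs a GPU, picks up jobs
    own_stake: float
    jobs_completed: int
    delegations: dict = field(default_factory=dict)  # delegator -> stake

def distribute(reward_pool: float, workers: dict) -> dict:
    """Split an epoch's reward pool by completed jobs, then split each
    worker's share between the worker and its delegators by stake."""
    total_jobs = sum(w.jobs_completed for w in workers.values()) or 1
    payouts = {}
    for name, w in workers.items():
        worker_share = reward_pool * w.jobs_completed / total_jobs
        total_stake = w.own_stake + sum(w.delegations.values())
        payouts[name] = payouts.get(name, 0) + worker_share * w.own_stake / total_stake
        for delegator, stake in w.delegations.items():
            payouts[delegator] = payouts.get(delegator, 0) + worker_share * stake / total_stake
    return payouts

# Example epoch: two workers, one tier-one holder delegating to worker "gpu_a".
workers = {
    "gpu_a": Worker(own_stake=1_000, jobs_completed=80, delegations={"alice": 500}),
    "gpu_b": Worker(own_stake=2_000, jobs_completed=20),
}
print(distribute(reward_pool=10_000, workers=workers))
```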

(18:43):

Like any staking system, it’s important to have some good buy and sell pressure. I know there was a lot of discussion over whether the rewards would be in ETH or in GAI. And for me, it’s kind of the same thing. As long as there’s adequate buy and sell pressure, one can be exchanged for the other. But obviously, the native currency here, we want GAI. ZK, did I miss anything?

zkbrain.eth (19:13):

No, you’re rocking it. Just keep going.

cob analyst (19:16):

I’m getting a question. Someone actually directly added me. So it’s a timeline question. I am hesitant here. I think ZK, keep me honest. For tier one staking, I don’t see a reason we… So again, tier one being non-GPU staking, I don’t see a reason we can’t get that out within, say, 30 days. Does that answer your question?

zkbrain.eth (19:42):

Yeah, I’d concur with that timeline.

cob analyst (19:45):

Well, that’s exciting. 30 days, there we go. Tier one staking. Let’s fucking go.

zkbrain.eth (19:50):

Yeah, let’s go, guys. Come on. There’s the excitement in the room. Let’s go, guys. Woo!

YosemiteSean (19:57):

Well, gentlemen, that’s a pretty big question that I’ve been hearing people ask, so that’s pretty awesome. And honestly, most of the questions I’ve already had lined up here have been answered. You guys have been absolutely killing it. We’ve already addressed when user-submitted models will be ready: now. When will the white paper be ready? Almost immediately. All this is for you guys, so I’m excited to see what you guys have to say about that. Are there any other killer updates you guys want to share, or should we try to pick out a few more questions or even a speaker from the audience?

zkbrain.eth (20:31):

Yeah, why don’t you filter through that, see if there’s anybody that wants to come up. And I guess the big takeaway here is kind of a call to action. We’re not only going to succeed as a team; from a decentralized standpoint, the community is a big part of our success. And yeah, we couldn’t do it without the community support here. I would say that the big takeaway is just go out there, just spread the message, have people kind of come on board, take a look, see that there’s really no commitment to using the platform. Our numbers are growing there. That’s super important. And yeah, do your own homework. I mean, sure, there are a lot of AI crypto projects that launch on a daily basis, but ask yourself, which ones can you use right now? Which ones have an actual utility where it actually has a genuine intersection with crypto? I mean, there aren’t any, none that I can see. I’ve seen some other projects with a loose tie-in, but it doesn’t really encapsulate the true meaning of Web3, or at least the items that I believe in, which is decentralization, provenance of people’s work, creators having self-sovereignty. I mean, those are really big pillars and I firmly believe them and we’re just going to continue to ship. So there’s not going to be a major change or shift in terms of our strategy. It is go out and continue delivering on a product and not just hype. So yeah, that’s kind of all I have to say in terms of a call to action for the community. And yeah, that’s what we kind of have to do is rally together.

(22:33):

If we’re divided or anything like that, we’re going to be in trouble.

cob analyst (22:38):

Yeah, I just wanted to add as well, in terms of like token utility, we’ve released a lot of features, new models and other things in the last few weeks that I think we could have easily thrown up a token gate in front of. And ultimately, we as a team, we decided not to.

(22:55):

And the rationale, of course, being that the platform as it is should be open and that anybody can come in and use it and try it. And we had even talked about generating images without even a wallet connection, right? I think our approach for the time being of not token gating things within the platform has been correct.

(23:20):

That being said, we do have a couple of features planned for the next two weeks that will be our first features that are token gated on our platform, providing holders some utility that I think is going to be interesting. And I just wanted to reiterate that we could have done that with our other features, and we have the technology. Everyone can see their GAI balance on the platform. The platform knows how many tokens you have. But we decided not to, right? And again, for the purposes of bringing as many people in the door as we can.
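
As a hedged illustration of the balance check being described (not Generaitiv's actual code), a basic ERC-20 token gate with web3.py could look like the sketch below. The RPC endpoint and threshold are placeholders; the contract address is the $GAI address from the links in the description, and 18 decimals are assumed.

```python
# Hedged illustration of a simple ERC-20 token gate using web3.py (v6 API).
# This is not Generaitiv's code; the RPC URL and threshold are placeholders.
from web3 import Web3

ERC20_ABI = [{
    "constant": True, "inputs": [{"name": "owner", "type": "address"}],
    "name": "balanceOf", "outputs": [{"name": "", "type": "uint256"}],
    "type": "function",
}]

GAI_ADDRESS = "0x0d8ca4b20b115D4DA5c13DC45Dd582A5de3e78BF"  # from the links below
w3 = Web3(Web3.HTTPProvider("https://YOUR_ETH_RPC_ENDPOINT"))  # placeholder RPC
gai = w3.eth.contract(address=Web3.to_checksum_address(GAI_ADDRESS), abi=ERC20_ABI)

def holds_enough_gai(wallet: str, min_tokens: int = 1_000) -> bool:
    """Return True if the wallet holds at least `min_tokens` GAI (18 decimals assumed)."""
    balance = gai.functions.balanceOf(Web3.to_checksum_address(wallet)).call()
    return balance >= min_tokens * 10**18
```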

YosemiteSean (23:57):

Super cool hearing about more features upcoming, a little more utility directly for the hands of our holders. I actually do have another question in regards to an earlier point made. I was reached out to by a model creator and they were asking if model royalties will be live as soon as models can be submitted.

zkbrain.eth (24:20):

Yes, absolutely, that’s a core feature and a tenet and pillar of the reason for our existence. So absolutely, we are manually onboarding model creators and developing it ourselves internally, and any model creators that are ready to go, reach out. We have a manual onboarding process and, internally, we would absolutely love to be able to do some testing as well so that the timing is right. And that’s why we’re announcing this.

YosemiteSean (24:55):

That’ll definitely make them happy to hear. I’m sorry there, Cob, go ahead. Didn’t mean to cut you off.

cob analyst (24:59):

No, not at all. I was just gonna take the conversation in another direction on something we touched on at the start of the call that I don’t think we said enough about. So if there’s anything else there, I’m happy to hear it, Yosemite.

YosemiteSean (25:12):

No, nope, that was it. Honestly, the feedback I’m hearing already on the model submission being more or less ready right now is already pretty crazy. And I’m sure that that answer has got them riled up as well. So thank you.

cob analyst (25:26):

I’d love to hear that. So we said at the start there that one thing that’s coming up as well is an exclusive AI NFT collection, with early whitelisting, again, being for holders only. I just wanna say, I’m not gonna say what the collection is in terms of the art style and whatnot, but it is something internally as a team we have developed. We’ve been working on it for a bit here. Actually, probably around the same time that we started working on Generaitiv, we started talking about this as an AI-based NFT collection. It’s something that is going to, A, showcase what’s possible with our platform and our technology. And B, I think it really is an interesting take on what an AI NFT collection can be and potentially how it gets minted out as well from a model. I think there’s a difference between an AI NFT collection where someone just kind of ran it through AI and actually releasing a model specifically for an NFT collection. I think that’s potentially very interesting, especially when you throw in the royalties aspect of it. So yeah, more to come on that. I don’t have any firm timelines on when that will actually drop, but you can expect to hear more details on that in the coming weeks.

(27:02):

That’s just something as well that I don’t even think we put on our roadmap. I’m just looking through it now. No, it’s not even in there. Oops. But yeah, it’s interesting.

YosemiteSean (27:13):

It’s okay. It’s a bonus to the roadmap. It’s an extra special surprise for everyone who’s along for the ride.

cob analyst (27:21):

Yes, something else too that I think we saw in the last week, I don’t know if we said it on another space, but user D-Y-O-R was launching a Skull kind of clothing brand. I don’t want to misrepresent what he’s trying to accomplish there, but really what we did there is built a contest themed around that, and he was doing an additional giveaway for the winners. And I really want to thank him for that. I really want to encourage people, if you have an idea for a contest, something that you’re working on, even a side project that you want to collaborate with us on as a contest, absolutely reach out. We love that collaboration. We love kind of open sourcing it and allowing anybody who wants to showcase something of their own to go for it. We’ve got Near in the audience as well, same thing with his collection that he launched a few weeks ago. We absolutely love it. The team has a willingness to help you get it out the door. So please, please reach out and we’ll make it happen.

(28:24):

I’ll just say with that, if anyone wants to come up and ask some questions, please do, we’re kind of halfway past the hour here and I just want to open it up. And if there’s any other questions there, Yosemite that have come through on Telegram that I missed or here in the chat. Yeah, let’s get them going.

YosemiteSean (28:42):

So far I’ve filtered through my DM questions, but if anybody in the room here has a question, throw up your hand, request to speak, and we’ll get you up here to ask your question. We’ve got Garlic Josh as well in the TG watching for any questions. So if you’re a little nervous to speak, go ahead and shoot a question in there and we can get it over here. Well, I see you waving, Jose. Are you trying to come up here and speak? Just shot you an invite there, sir. You’re welcome to come up and ask your question.

Wizardly Prompts (29:12):

Hi, just an introduction: I’m Wizardly Prompts. I currently sell prompts for a living, so right now that’s what I’m concentrating on. I just want to know if there’s an opportunity within this platform to sell prompts as NFTs, or special prompts, because there are some advanced prompts that generate specific imagery. And I just feel like that might be a potential thing to include in the platform. And I do see that there are some platforms that are adopting that strategy where they sell prompts as NFTs. A lot of the complaints that we get on the current platform that I’m on, PromptBase, is that there are entire countries that cannot get access to our prompts because they don’t have a way to pay for them. So basically that was just my question. I was wondering if that would be possible in the future.

zkbrain.eth (30:24):

Yeah, absolutely. That’s the whole point. We’re looking at not only imagery, like text-to-image; that’s just the start. I felt that was something a lot of people could relate to really quickly, but we absolutely are building out the mixed media side, and having support for that is not a big technological stretch by any means. In other AMAs that we’ve been on, that was one item we were addressing, and we believe there’s a ton of opportunity here. People are worried about AI kind of taking over jobs, and I believe that’s short-sighted. There absolutely are more careers and jobs that are gonna be created out of this new technology than ones that it’s gonna displace. And prompt engineering was on that list. So absolutely, I’m glad to hear that. And Jose, I’d love to connect with you offline. So if you want, just please DM me here after the spaces and we can figure out a roadmap and make sure that you’re able to not only get credit for it, but also be compensated for it. So we absolutely will be supportive of your journey, not just here on Generaitiv, but your AI one in general.

cob analyst (32:01):

Yeah, just to add to that as well, with it being in Web3 and potentially sold as an NFT, there’s an interesting opportunity to use cryptography to actually mask or somewhat obfuscate the original prompt that’s being used. It depends on the use case, why someone would actually buy a prompt. But if that’s kind of intellectual property, there are ways to potentially feed things into a model without necessarily disclosing them publicly, where someone can use it on a per-use basis or pay some kind of royalty. So there are interesting business models that we could set up there for people to use your prompts without necessarily disclosing them, which could be, again, I don’t know if that’s of interest to you, but it definitely would be interesting to the industry.
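
One generic pattern for the idea being described here, selling or licensing a prompt without publishing its text, is a salted hash commitment: only the commitment is made public (for example, in NFT metadata), and the prompt can be revealed and verified later if needed. The sketch below is a minimal example of that pattern, not a feature Generaitiv has announced.

```python
# Minimal sketch of a salted hash commitment for a prompt, as one possible way
# to sell or license a prompt without publishing its text. Generic pattern only;
# not an announced Generaitiv feature.
import hashlib, os

def commit(prompt: str):
    """Return (salt, commitment). Publish the commitment (e.g. in NFT metadata);
    keep the prompt and salt private."""
    salt = os.urandom(32)
    commitment = hashlib.sha256(salt + prompt.encode("utf-8")).digest()
    return salt, commitment

def verify(prompt: str, salt: bytes, commitment: bytes) -> bool:
    """Anyone can later check a revealed prompt against the published commitment."""
    return hashlib.sha256(salt + prompt.encode("utf-8")).digest() == commitment

salt, c = commit("hyper-detailed ink portrait, cross-hatching, 35mm, golden hour")
assert verify("hyper-detailed ink portrait, cross-hatching, 35mm, golden hour", salt, c)
```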

YosemiteSean (32:58):

Wow, okay, I gotta say, that was a pretty bomb question and honestly, very insightful. Really cool to see how there are additional markets, or maybe additional aspects of the market, we’re able to expand into, and people already thinking of ways to use the platform that we haven’t even outright proposed ourselves. We do have another question coming from the Telegram here. So the Telegram question is: is there anything going on, our favorite question, anything going on regarding marketing campaigns? When marketing?

zkbrain.eth (33:33):

Yeah, absolutely. There is, with our own particular strategy that I’ve mentioned on other AMAs before: platform growth. Obviously, we’re gonna do that through that growth. Cob had mentioned, there has to be a reason for it, so we develop an ecosystem. Otherwise, it just becomes like a meme coin and you’re just pumping the meme coin without utility. So I’m gonna be absolutely crystal clear.

(34:01):

We are developing an ecosystem so that it’s a healthy long-term product, not a pump and dump. There’s a lot of meme coin projects out there; this is not one of them. So the marketing strategy is being developed with Jay at 256 Marketing. We are also gonna be at NFT Miami, one of the largest shows there. There’s an absolutely massive artist community that’s gonna be necessary for this to be a long-term success, and we’re razor-focused on that. So there is and has been a continuous marketing spend and strategy. By all means, follow along with our Twitter, where all of the announcements go up. And we are constantly working and addressing that solution. I think I’d call that a pretty solid response to “when marketing.”

YosemiteSean (34:51):

It looks like we’ve run dry on Telegram questions. Would anyone else like to come up and throw out their own question live? Throw up your hand or send in a request to speak. All right, but not everybody at once though. But I mean, realistically, we’ve provided a pretty in-depth overview here. So people might be a little strapped for questions. Are there any other comments you wanted to throw out there?

cob analyst (35:20):

Yeah, I mean, just taking a step back and looking at the big picture, what’s the opportunity here, right? Like, we keep coming around to different numbers for how big AI is now, how big AI is going to be. You know, what fraction of a percentage does a single product need to own to be just an absolutely huge business, right?

(35:44):

It’s a surprisingly small percentage of the entire industry. So, you know, just keep that in mind. Again, like ZK was saying, do your research, look at other platforms, if we can even call them platforms, and just, you know, realize where we are.

YosemiteSean (36:05):

No, thank you.

zkbrain.eth (36:05):

Oh, sorry, sorry, it’s ZK. It’s all good. To kind of just put it in perspective, we’re just scratching the surface. I mean, these proprietary AI systems that are out there, the industry as a whole is like $300 billion. It’s only gonna grow from there as more utility, more plugins, more capability is being launched. We’re so absolutely early. And it’s just so incredibly exciting to be at the intersection here between Web3 and AI, because I believe it’s an absolute match made in heaven. What’s gonna happen is, with the proprietary AIs, people are going to use them, but they will not be in control of them by any means. You know, from a societal trust perspective, would you trust like an AI Twitter to kind of make decisions for you guys? It’s so pivotal that people are able to have access to these open source AIs where they actually are in control. They are able to have the tools and the necessary means to develop their own AI for their own uses. And I firmly believe in that. And, you know, we know what’s coming. It’s like everyone’s kind of used ChatGPT, you know, and we kind of joke about it. And it’s incredibly powerful, but that’s because it’s the only major game in town.

(37:48):

As that space kind of grows, Cob has, you know, put it quite bluntly: you’re gonna start seeing those old, like, Yahoo kind of hyperlinks being, you know, put into your answers and your prompts. You know, I see, you know, Microsoft kind of being behind OpenAI, or closed AI there. And you’ll ask it, you know, what’s a procedurally generated game? And it’s gonna spit out, you know, an answer. And within that answer, you’re just gonna have a soft plug or a banner ad in that, you know, walled garden, in terms of like a paywall, where it’s like, you just, you know, have like 30% off for Xbox Live. And, you know, you need to interact with that ad before you can get your full answer on the free plan. We know that’s coming. So, you know, it’s just so incredibly early here. And I’m just so absolutely happy that we already have built a community and it’s growing day by day. And I’m glad we were able to kind of get something out there for everyone to use at this stage of the game.

cob analyst (38:57):

I was just gonna say, too, you know, imagine if every time you generated a picture of a car, it just always has a Toyota logo on it. I know there are maybe some Toyota fans out there, but that again is, I know it’s dystopian, but that’s what’s coming if open-source, free AI is not done properly. It’s a world I don’t wanna live in. I don’t think many of you would.

YosemiteSean (39:21):

I refuse to live in a world of only Toyotas.

cob analyst (39:24):

Yeah, I mean, Toyota is maybe a contrived example, but you know what I mean. A few more questions in the chat. One there I think needs a little bit of clarification; not quite sure what we mean by collateralize. But there’s one here which is, when will we be able to delete a collection? I don’t know exactly when we’re gonna release that, but that’s kind of low-hanging fruit for us. Might be something we can do in the next week or so.

YosemiteSean (39:52):

We’ve also got the question of, will there be any collaboration opportunities? So, you know, with artists collaborating on art, potentially collaborating even on models, et cetera, just anything you wanna share in that area.

zkbrain.eth (40:07):

That is absolutely something that is technically possible in terms of the framework. And for, like, you know, royalties on the platform, if we weren’t able to do that, we wouldn’t be able to set up model royalties. I would say in terms of priority for our projects, we wanna get the model royalties out there, you know, as quickly as possible and out in production. I would say once that’s proven out, that’s gonna be something that we can work on.

(40:40):

You know, once that proof of concept, that feature, is released, we will be able to look into how artists and creators can divvy up portions of their royalties.

YosemiteSean (40:53):

Definitely nice to hear that, you know, that flexibility is possible, that it’s within the realm of reality for us. Do we have any developments on the collateralization question? I’m seeing no development there personally. Oh, but we do have one more. So this one might be a little more intangible, so take a minute there, gentlemen. But, you know, what exactly are we looking to capture? Well, you know, what market are we looking to capture with this product?

zkbrain.eth (41:30):

The AI industry, I mean, I kind of already mentioned it. It’s just right now it’s sitting at $300 billion, you know, in terms of the scope of how much Web3 intersects with that. I’d say there’s probably numbers that are gonna start filtering out within the year, but our strategy is truly aligned with the Web3 space. We have been in Web2 for many decades and, you know, I feel the opportunity here is some decentralization of, you know, AI as well as crypto. So, yeah, in terms of like market research, that’s not our wheelhouse, but that’s our target market for sure.

cob analyst (42:19):

Yeah, maybe some kind of sub-industries or sub-groups within AI, you know, think of model development, model distribution, model infrastructure or just general AI infrastructure, as well as payments collection and royalty distribution. Those are all, each in themselves, big, important industries within the AI space, like the services that allow AI to happen.

zkbrain.eth (42:54):

And just to kind of go back, I didn’t want to ignore the collateralization question. It’s not something that is technically difficult for us. However, I would say it’s more of a downstream thing right now. AI art is growing quickly; however, it is a subset of the overall NFT visual art space. Until that kind of changes, the NFT-Fi kind of space, I feel like if we went down that path, it would kind of dilute our focus. We don’t want to spread ourselves too thin and have too much surface area. But yeah, if the economy or the business case makes sense, we will pursue that. It’s not something that we’re actively developing at the moment, but we’ll certainly keep our eyes on it.

YosemiteSean (44:01):

I’ve got one more model creator question here. Now, this one I’m pretty confident about the answer to, but I’m sure it’ll make some people in the community happy to hear it come from you guys. Will model creators need to hold any GAI to actually submit their models or have them hosted on the platform?

zkbrain.eth (44:24):

No, they won’t. What will end up happening is the royalties are paid in ETH, and they’re paid immediately on the sale of the derivative art piece. So yes, they do not require GAI to create models. But again, we’re not going to kind of token gate the growth strategy of user adoption here.

(44:54):

It’s something where, when we reach a critical point of the economics of the platform, that’s where we’ll look at enabling those features. You’ve seen it before with even just NFT marketplaces. X2Y2, they subsidized it a little over a year ago, where there were absolutely no marketplace fees. And there are other tokenomic mechanics, like LooksRare, where it’s an inflationary token. So we’re going to actively look at that. We’ll make announcements based on it, based on how much traction we get and the economics of the platform. But what’s unique here is we need to get that decentralized compute out there. That’s where we’re going to create the demand, the buy and sell pressure, on the native GAI.
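
ZK noted earlier in this answer that model royalties are paid in ETH immediately when a derivative piece sells. The split percentages have not been published, so the sketch below uses made-up numbers purely to illustrate how such a settlement could be divided.

```python
# Illustrative settlement split for a sale of a derivative piece.
# The royalty and fee percentages here are made up; the team has only said
# royalties are paid in ETH immediately at the time of sale.
def settle_sale(sale_price_eth: float,
                model_royalty_pct: float = 0.05,   # hypothetical
                platform_fee_pct: float = 0.025):  # hypothetical
    model_creator = sale_price_eth * model_royalty_pct
    platform = sale_price_eth * platform_fee_pct
    seller = sale_price_eth - model_creator - platform
    return {"model_creator": model_creator, "platform": platform, "seller": seller}

print(settle_sale(0.2))  # e.g. a 0.2 ETH sale
```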

cob analyst (45:54):

Great answer, ZK. I’m seeing another question here, which is, why is GAI on ETH and not another chain? And are there any plans to expand to different chains? So in our roadmap, we will have a bridge to bring GAI over to an L2. Of course, the reason being that we need lower transaction fees for our distributed GPU network mechanics to work. We might otherwise be paying a few dollars in fees for a generation that costs less than a cent. So we will have an L2 bridge for that. The intent, of course, is not necessarily to trade said token on an L2, but the idea being that transacting it would occur on an L2.

(46:41):

Hope that answers it.
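
To put the "few dollars for a generation worth less than a cent" point in rough numbers, here is a back-of-the-envelope calculation; the gas usage, gas price, and ETH price are illustrative assumptions, not figures from the team.

```python
# Back-of-the-envelope gas cost for paying out a single generation on L1.
# All numbers below are illustrative assumptions, not figures from the team.
gas_used = 65_000          # roughly a typical ERC-20 transfer
gas_price_gwei = 30        # assumed L1 gas price
eth_price_usd = 1_800      # assumed ETH price

fee_eth = gas_used * gas_price_gwei * 1e-9
fee_usd = fee_eth * eth_price_usd
print(f"L1 payout fee ~ ${fee_usd:.2f}")   # ~ $3.51, vs. a job worth < $0.01
```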

YosemiteSean (46:43):

I’m going to say that definitely answers it. Perfect. Well, on that, is there anybody else in the audience who would like to throw up their hand for an in-person question before we start winding things down?

zkbrain.eth (46:56):

That’s great. Thanks so much, everyone. Really enjoyed this. Again, not everyone’s going to agree with us. I’m absolutely open to having more of a discussion and debate. We don’t want to be working in an ivory tower here. So just to embrace decentralization, I’m totally open to having differing opinions as long as everyone keeps it respectful. And I invite challenges and thought exercises. Those are a lot of fun to me.

cob analyst (47:31):

Yes, very exciting. Lots of great updates. We’ll have more next week. I think it’s Tuesday next week that there’s another community space, more of an artist-focused space that’s happening. So please stay tuned for an announcement on that. Hope to see you there as well. And just one more. It’s not a question, but someone in our TG said they’re excited to see where this goes, and that was one of the users asking a lot of those questions. So thank you for that feedback. They’re also very excited for staking, as are we. So I’ll leave it there, guys.

zkbrain.eth (48:09):

Yeah, thanks, IE.

YosemiteSean (48:11):

Thanks for all the great questions, guys. And thank you to everybody for coming out and taking a minute to speak with and hear from the community more. We’re happy to be able to share some more updates with you guys.

zkbrain.eth (48:22):

Have a great week.


Video Description

Recording of the Generaitiv Community Twitter Spaces where the team gives an update about the utility of the $GAI token, staking, and upcoming changes to the platform.

Speakers:
cob analyst (Co-Founder): https://twitter.com/undervaluedjpeg
zkbrain.eth (Co-Founder): https://twitter.com/zk_brain
YosemiteSean (Community Manager): https://twitter.com/Yosemite_Sean

Generaitiv Links:
Website: https://generaitiv.xyz/
Twitter: https://twitter.com/generaitiv
Telegram: https://t.me/generaitiv
Discord: https://discord.gg/9CdNGvUFKu

Whitepaper: https://content.generaitiv.xyz/Whitepaper.pdf
Buy $GAI on Uniswap: https://app.uniswap.org/#/swap?inputCurrency=eth&outputCurrency=0x0d8ca4b20b115D4DA5c13DC45Dd582A5de3e78BF
$GAI Chart: https://www.dextools.io/app/en/ether/pair-explorer/0x0d8ca4b20b115D4DA5c13DC45Dd582A5de3e78BF

Generaitiv is a community-driven AI platform built to empower AI contributors. $GAI will power the largest, public, and decentralized AI computing network in the world. The Generaitiv AI blockchain at a protocol level will initially launch as an L1 Ethereum token. $GAI will be used to incentivize GPU computing power to the network. $GAI has the potential to become one of the best AI tokens of 2023.

