
Digital Nexus
Dive into the thrilling world of data, digital, and AI with your superhero hosts, Chris and Mark. This dynamic duo of consultants has built digital wonders in Australia and beyond. They wield their innovation powers at Digital Village and NotCentralised, respectively, bringing you the news, views, and opinions that are simply out of this world.
Mark Monfort, the tech wizard behind the @AusDefi Association and NotCentralised, isn't just a name—he's a legend. With blockchain fintech victories under his belt, he's now on a quest to build the ultimate #LLM, SIKE.ai, enhancing business workflows and securing data like a true digital sorcerer. Nothing can stop him!
Chris Sinclair, the design guru and UX/CX mastermind, knows the secrets of digital innovation and business strategy like the back of his hand. Partnered with Digital Village, a league of specialists leading the charge in product development and innovation, Chris is here to prove that the old ways of working are no match for the future!
Get ready for epic discussions, expert perspectives, and a sneak peek into the future of digital innovation. Don't forget to like, subscribe, and stay tuned for more episodes as we explore the frontiers of technology with a dash of humour and a whole lot of superhero flair...or fails!
Digital Nexus
Ep 31 | Claude, OpenAI and Perplexity strike back with huge announcements
🚀 Episode 31 | Post-Google I/O Shockwave, Agentic AI Chaos & a $1M Hackathon?
🎙️ Hosted by Chris & Mark | Witty, high-energy, and absolutely packed with insights
Still catching your breath from Google I/O?
So is the entire AI industry. In this episode, the boys sift through the aftershocks—and unearth gold from the companies overshadowed by Google’s tidal wave.
🔥 Highlights this week:
🧠 AI Whiplash
Google I/O left everyone reeling: Veo 3, NotebookLM mobile, and that wild try-on shopping tool
Veo's hyperreal video generation sparks laughs, awe, and deepfake anxiety
⚔️ Agentic AI Arms Race
Anthropic’s Claude 4 drops new tools: Files API, prompt caching, voice mode, web search for free users
Microsoft Copilot unleashes multi-agent orchestration—build AI “coworkers” in Excel
OpenAI eyes dominance with "Sign in with ChatGPT" plans and memory tweaks
Perplexity enters the dashboard arena: AI-powered spreadsheet tools + live data visualizations
Mistral, DeepSeek, Opera, and xAI all launch agent-powered updates
🏆 $1M Hackathon Alert!
Bolt drops a global AI build challenge with 50,000+ participants
Chris & Mark are entering—expect a future demo 👀
Free domain registration, hosting, and Bolt Pro credits included
🎭 Real Talk + Satire
AI-generated stunts? Cooking with plutonium? The guys joke (and worry) about the rise of hyperreal content
The need for blockchain-based digital IDs to verify real vs. fake content
Ethical dilemmas in AI-powered therapy
Links:
Google NotebookLM: https://blog.google/technology/ai/notebooklm-app/
Copilot Multi-Agent: https://news.microsoft.com/build-2025/
Claude 4: https://www.anthropic.com/news/agent-capabilities-api & https://www.anthropic.com/news/claude-4
Sign in with ChatGPT: https://techcrunch.com/2025/05/27/openai-may-soon-let-you-sign-in-with-chatgpt-for-other-apps/
Mistral AI: https://mistral.ai/news/agents-api
Salesforce buys Informatica: https://www.salesforce.com/news/press-releases/2025/05/27/salesforce-signs-definitive-agreement-to-acquire-informatica/
DeepSeek R1 update: https://techcrunch.com/2025/05/28/deepseek-updates-its-r1-reasoning-ai-model-releases-it-on-hugging-face/
Opera browser update: https://techcrunch.com/2025/05/28/operas-new-browser-can-code-websites-and-games-for-you/
🎯 Takeaway
"AI’s moving fast. Stay curious, stay building, and don’t get left behind."
📲 Hit play to decode the chaos. Because this week... was anything but normal.
Other Links
🎙️ Our podcast links here: https://digitalnexuspodcast.com/
👤Chris on LinkedIn - https://www.linkedin.com/in/pcsinclair/
👤Mark on LinkedIn - https://www.linkedin.com/in/markmonfort/
👤 Mark on Twitter - https://twitter.com/captdefi
SHOWNOTE LINKS
🔗 SIKE - https://sike.ai/
🌐Digital Village - https://digitalvillage.network/
🌐NotCentralised - https://www.notcentralised.com/
YouTube Channel: https://www.youtube.com/@DigitalNexusPodcast
X (Twitter): @DigitalNexus
Welcome back to another episode of the Digital Nexus podcast. We have a few interesting things this week. Well, this today to talk about. It's been one of those weeks post a huge wave of Google announcements. Yep, it's been a whole heap of, um, just not Google announces for me. I've had a whole heap of podcasts and stuff, so I'll update you on all of that. Um, there's a whole heap of insights from the the world of AI in terms of like how to build some big hackathons that you should check out, wait till the end of the show and we'll show you more about that. We'll be participating and stuff. But then we also cover the news and just really interesting, just progress continuing to go along and our opinions on all of that. Yeah. Claude, Microsoft, um, Google R1, a bit more Google stuff. Lots of things happening. We're not gonna waste your time. Let's get into it. See you there. And bright new light future forward in your ear. I broke into a zoo to prove one man is enough to fight a gorilla. Welcome to the Chernobyl noble challenge. I'm going to lick this glowing pole. Let's see how many views this gets. What the heck? Licking Chernobyl. Oh my God. Just content bathing in liquid cement until it hardens. Let's get solid. Get ready with energy drinks. Just gasoline. It's like a Zoolander digging to the Earth's core. Bare hands, no brakes. Can I survive a full latex suit in 100 degree heat with no water? He's gonna lose a lot of weight. Oh my gosh. Staring at the sun for ten minutes straight. Wish me luck. Counting every grain of sand on this beach. Let's go. One. Two. Three. That's not right. I'm doing K2 the honest way. Alone, unprepared, and in a straight line. Leave a comment for the search team about to do the first plunge into an active volcano. Let's send it. Go, Chad. We spent $50 million to rebuild the Titanic just to sink it again. Last person to abandon ship gets a Lambo. I covered my entire body in compost and waited to see what started growing. Running for my life on North Sentinel Island, day three. Should have picked. You can see some fake eyes. All right. I've sealed my head in this plastic box. And I'm gonna try to read the entire dictionary before I pass the light. Reflection. That's crazy. North Korea with a lawn chair and 1000 party balloons. Don't try this at home. I'm about to survive 100 days in the African savannah. Only eating and drinking. What? My body already gave up. Basically, the circle of life. Hey, guys. Welcome back to the channel. Today we're cooking with plutonium 239. It's hot, it's dangerous, and it's definitely illegal. I have to eat diamonds for views. I hope you're happy. Algorithm. Wow. I can feel that. I can feel that. But like the movement when he picked him up, like it actually. That's insane. Welcome back. Chris. That was I'm like, this is. It's pretty scary time if you think about it. Like scary time. It's going to get to a point now where we're not real. We were already we're not real. Surrounded by fake news. Pre I believe in the pre theory the prom theory. You know, one thing I've noticed. Maybe this is the conspiracy theory in me. But like I'll be talking about something and then all of a sudden it'll appear like I'll be talking about something with Alana. Right. And like, a surname, for example, a weird surname. And then all of a sudden, like a day later, I'll see it come up on someone's post in LinkedIn. It's like, it's a weird random listening to you, man. It's listening to me. It's definitely listening. 
It's it's oh my God, let's not get there. Um, but yeah, folks, what a week, what a week. Well, I mean, actually, I would say less than what, a week this week? Less than what? A week after last week. Yeah, I know, after the what? The bombshell last week. This is it's still in the news. You know the Aussie comedian, uh, Carl Baron. Yes. He had a joke where it was like, um. Yeah. You gotta like, be you gotta time. Well, um, your surprising things so you can't go with, like, the least surprising than the biggest surprising thing because it's like, hey, I found $5 and you raise your eyebrows, and then you tell the person, and there's life on Mars, but they can't raise their eyebrows any further. Like, where did you find the $5? You know, so that's what the week's kind of been like. It's been a big week as well. It has, it has. But just because we've had the hundred updates from Google, like it's hard to top that. It's kind of like I feel bad for all the other businesses out there, because there was actually quite a lot of other news that happened that week, um, that were just completely overshadowed. I mean, the Claude for you had, um, OpenAI did a few releases on their code updates. Then you had Microsoft did had actually they had their some kind of I o software on it. Yeah. They apparently they were doing stuff. They launched they released a whole bunch of big things for business and corporate. Completely overshadowed. Completely missed. Like, it's just, um, nuts. I think Nvidia I mean, they had some pretty hectic tech updates. And again, just like all the news was just Google, Google, Google, Google, Google, Google, Google. Yeah, it really was. And, um, you know, we're gonna, uh, I will be trying out the Vo three stuff. So hopefully I'll have some stuff for you guys for next week. But I mean, I think we we can do the intro. The intro is enough. Just the, the quality that it produces. It's insane. Like there is so much realism. You can see you can see some fractions in the background. Like things are a little bit like when there was being chased by the, uh, the lost African tribe and stuff, and then you could see some morphing of people in the background. Fine. But like the car show one and like these other ones. Crazy. Yeah. Like exactly like you just read that. I do wonder if there's going to be a Sydney like if you describe, like, Sydney Opera House and interviews random strangers there, if it will take it into account. That's what we'll play with. Let's play with that. I'm gonna. Yeah, we'll do that. We'll do that. So. So that's really good. Um, Chris, the week that has been, what have you been up to? It has. It has been a week. I've actually been knee deep in a lot of, um, customer research working for this not for profit, um, synthesizing a lot of insights and data and trying to think in my head how I can, um, identify a lot of the processes that I've been doing because I still I'm using a lot of AI tools to to actually run the research pieces, to then pull it and like, investigate and explore, um, the outcomes of those research pieces, synthesizing it and then creating opportunities where I tend to get a bit more manual in those processes. But I'm jumping between tools manually. I'm just trying to explore how do I identify those processes, because some of the tools I'm using, they don't have APIs or connectors. So, um, I need to yeah, that's kind of the thing I'm playing with right now. Um, you know, an interesting topic and I'll hopefully be able to show this later. 
I've got some collaborators that, uh, will be working with, but, um, MCC were supposed to be model context protocol folks. That's the one that was supposed to be this common standardized language that helps you connect to other apps. But there's issues with those, and we might even have a special guest to come on and talk about that, because whilst it's all sexy at the surface level, it's it's always in the detail. The devil is always there in the detail. And as my devilish colleague Arturo will say, it is, you know, the devil's in the detail there, and you find it when you actually start to apply these things where they don't quite work. And there's, um, what these guys, these other guys have done, they've gone into the hard AI side of things, like actually into the automation software, the coding, not just the vibe stuff, but actually getting into the weeds, proper data science work to find the actual ways that you can say, um, properly connect to zero. You get an invoice in your email, and then it extracts the data and pushes it out to zero in proper detail rather than just, oh, MCC is not quite working for us just yet, but I think there's a good mix of that kind of stuff coming. Um, and it's this back and forth, right? Like it's kind of like there's fear mongering on the one side. And then there's also like overhype on the other. But what really sets in, what really works is like something in the middle where it's a little bit of the gen AI stuff, but it's a little bit of traditional. But yeah, interesting stuff. And the protocols are still new. I mean, it was started by was it anthropic that started? Yeah. Yeah. And like it's still at, it's at it's basic. It's a connector. And so if the protocol can't understand what it's connecting to properly then it's not going to, it's not going to do the thing you want it to do. So understandably that it is going to have those limitations. And as it is such a new technology that not everyone has adopted yet. Yeah, it's funny though, um, because we do see this with, uh, with clients. And, you know, I'm kind of getting into what we've been up to, but, um, it's it's like if you take them to post 20, 22, the world that they're in, even just using ChatGPT and let's call it manual AI, which is really funny, right? Because it's like AI is such a magical tool already, even just you could take someone that has never touched I to November 2022 and give them ChatGPT then, and we will still be able to find a way to make that a much more efficient process of whatever they're doing. Their world is so much better than it was before, but yet we feel like because of all the things that have been going on, that's the manual. Yes. Going back and forth. Oh, what a it's kind of it's the wave of of adoption, right. Like you adopt the thing, it becomes integrated and then you're like, oh, but but I want it to be better or I want to be able to do this now. And now I'm spending too much time doing this other thing, or now I'm switching between like, there's always there's always something until there is nothing type. Yeah. You mean. I have to think? I have to think I have to understand what a baby's toy. Kind of just understand my mind. Yeah. That's. That's the world that we'll get into. But, um. What about you, mate? What's, um. What's been on? What's been in your world this week? Uh, a couple of things. So, you know, for example, we had our online meetup for the, uh, Blockchain Association. I showed this because there's I as part of it, I'll turn off the sound. 
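For anyone who wants to see what that MCP idea actually looks like in code, here is a minimal sketch using the official Model Context Protocol Python SDK. The invoice-to-Xero tool is purely illustrative: the server name, the field names, and the Xero call itself are hypothetical stand-ins, not a real connector.

```python
# Minimal sketch of an MCP server exposing one tool, using the official
# Model Context Protocol Python SDK (pip install "mcp[cli]").
# The Xero integration below is a hypothetical stub, not a real connector.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("invoice-tools")

@mcp.tool()
def push_invoice_to_xero(supplier: str, amount: float, invoice_date: str) -> str:
    """Create a draft bill in Xero from fields an LLM extracted out of an email."""
    # In a real connector this is where you would authenticate (OAuth2) and POST
    # the bill to Xero's accounting API; here we just echo the payload back.
    return f"Draft bill: {supplier}, {amount:.2f}, dated {invoice_date}"

if __name__ == "__main__":
    # Runs the server over stdio so an MCP-aware client (e.g. Claude Desktop)
    # can discover the tool and call it.
    mcp.run()
```

An MCP-aware client can then pick up the tool and call it when an invoice email lands, which is the kind of "proper detail" connection being described above rather than just surface-level vibe stuff.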
We had some great guests. Um, so on the blockchain side of things, we were giving updates. We were talking reports and data from Chainalysis. You collect a lot of that. But the cool thing is on the D5 website, because we built it with bolt, which we'll come back to later on. You get to see, you know, the insights and the updates from the simple explanation button, the FAQs or better yet, you can ask a question. So, um, so did Mark say anything funny? What do you think, Chris? Did I for this? No, I've never seen you. I did, I did, I said make Mike funnier. Okay. Yeah, not too bad. Which I is this using 4.1. It's using GPT. It's using GPT. That's why I was saying that you were funny. Oh, it's a shade thrown. If it was Gemini, it would know the truth. But, uh, you know, this is the thing. Like, you're able the way. And I've been discussing this with other association, um, folk and stuff like industry bodies, the way that we present things like running an association and doing, you know, um, these gatherings of like minded folks and presenting it on a website. It has to change. You know, this is I want to see more of this from others and stuff, but that's what I've been up to this week. We also do in this very studio. We'll film it after this, um, a new podcast. And thanks to Chris for helping set it up and film it. And you would have seen my colleague Arturo there. Um, but it's called Fex Insights. And so this is the website that we build for it also, you know, with I and also shout out to them for the space. Right. This is massive shout outs folks. Uh, you will see more of this space and um, the evolution as we get it all set up. But you can see Arturo here talking as you know, he he talks and talks as, as you saw. But we can do the eye thing there as well, where we get a bit of an update as to what is going on. So I cover the news in the blockchain space. It's more for traditional institutions that are starting to get to that intersection. So I was up to that. And also just, um, I just wanted to say that this week, um, this episode we're doing is the penultimate, the second last podcast I think I'm doing in terms of podcasts and webinars about nine. I've done this week, um, some myself. I did two last night. I did two the day before. There was like a webinar or two thrown in there. It's slow yourself down week. Yeah. Anyway, I'm looking forward to, uh, tonight, uh, it's Friday folks, when we film this Friday the 30th of May. Um, and it's going to be, uh, yeah, it'll be a great weekend because I get to rest. Finally, finally. Jesus, mate, you are all over the place, I love it. Shall we get to how much it is going on right now across just digital and tech in itself. Like business is changing when we're in it, and you are deeply in the thick of it. I'm on the sort of the edge of it, playing with the stuff that, you know, that helps people get things to market. You're like, right in the thick of the weeds with the, you know, innovations, new products, go to market, developing businesses, coming up with podcasts and stories every single week, like, oh my God, there's no stopping for you. Thank God there is. I, I mean, I am I am just the human receptacle, the flesh bag that delivers the what the I wants and and and also, you know, things about it as well, I guess proof that it does allow you to do more. Yeah. For the articles that have come out going, oh, I is not really making thing faster please. I had I did have um, a bit of a debate with a well known, um, no, I wouldn't even call it a debate. 
I didn't respond further and stuff. I've just been busy. I will, you know, jump in more later. I think I'll get, um. I think we need to like someone. There's just constant, uh, fear mongering kind of stuff. Um, but anyway, uh, more in the Founder's Journey newsletter that is out today that you will have seen last week. The interesting thing for me is, um, in talking about that is from the adoption standpoint, and I've got to be careful because I'm not I can't really say technically what it is or what they're doing, but like our big four is, for example, um, and I was speaking with someone around, you know, what are you using in your, in your business to expedite or speed up the processes that you're going through. When it comes to either a data analysis, for example, it could be even just, um, they've run a research piece, they're exploring what customers think about their products and their services. Yeah. Um, and they, you know, they interviewed a couple hundred people. They have a whole bunch of these questions and queries all put out in a very, very detailed spreadsheet, you know, including stats and data and tools they use and all that type of shit. And I was like, all right, how would you synthesize this? Like, what process are you going to go through for doing this? And they're like, well, we don't really have anything. I have to do it manually. What do you mean? You mean we have people? Well, you're saying people. People. So they have a lockdown version of ChatGPT, which can't do any data analysis. Okay. And then the only, the only thing they have on top of that is copilot, which I mean, which we chest some queries in, and it is awful. Yeah. I mean look awful. So they can't use it. They literally can't use it. They can't push the data. They can't manipulate it. They have to do it all manually. I think it's all like context, right. rate. Given that we know that the models, whatever model, even an older model, can be good if it's given, if it's given the right kind of context. Um, maybe that's why copilot is is not doing the things that people are expecting it to do. Or we see, you know, whether the government's been using it in different government departments failing a copilot or they say that, you know, we did a trial and it didn't quite work, or, um, AWS, uh, doing another trial with another government department doesn't really matter. The model, they're all transformers and stuff. Um, yes, they're probabilistic. And I dive into that in a podcast I did with, uh, Raymond's son, who's a lawyer slash tech guy. Um, but because they're probabilistic, like, you still need you need to read the outputs. And if you can steer it towards a, um, the right context and the right kind of steps it needs to take to get to what you want, then it shouldn't matter. Like, okay, not the oldest models like GPT two. Yeah, but like even just GPT 44.140, these ones will get you to what it is that you need, but you need to read the outputs and you need to steer in the right direction. But, um, for when you're dealing with data set, it's even different. Again, right? There are very specific models that only specific models that are good at that because they're still, you know, they're not using function calling or are you just using the actual model? No, because they don't have access to the function. Well, it's always copilot. Yes. They're using a function call, but it's not very good at saying do pivots. Or if you say, hey, I want to know what the average thing is being. Yeah. 
If you don't want to know what like the medium is or what people are saying specifically around this topic or what are the trends? Yeah, it can't do that. It can't read a large data set if you transcribe it and then pick out important things. When you say a data set, are you talking about words or numbers? A mixture of both. Mixture of both. A mixture of both. And that's the limitation of of this integration of their copilot. We've always thought that um new numerical stuff. So words it's a transformer. It's a language transformer. Words it's good at. But for numbers like it never trust just on its own what it comes out with. Even the anthropic showed that the reasoning that it comes up with it will make up reasoning to backfill how it solved, you know, 45 plus, um, 24 or whatever it is. It back solves it and shows you something that sounds right, but it might not be how it actually did it. So you can't trust it for the math, you have to check it. But even then, um, if it is function calling depends on what it's able to do. Like there are things that have been tested, but man, it is hard. Like you've got something, you've got data science tools that can just do that. But anyway, um, it is this constant game of finding the right mixture where like, uh, um, experimenters putting plutonium and chemicals in and jumping into the lava as those guys were before. But yeah. Fascinating. Should we get to the news? Yeah, let's do it. All right, over to the news. And we start off with something that I use a lot of, and it's been just getting more and more and more updates. I love it. Like the biggest thing apart from, like the launch was that, uh, and, you know, this is the one where you can put in anything, even just, um, abstract sentences like I did, um, McDonald's. McDonald's. Kentucky Fried Chicken and Pizza Hut 70 times or whatever it was, or across multiple pages. And it turned it into a full on podcast about it. But then you could enter, you could enter the podcast and you could talk to it live. Right? So that was the interesting something. News going on. What is it? Uh, they have well, I think it's been out for like a week now, but we haven't probably talked about it. But they released an app the first thing. So now notebook is pretty much available to anyone for free on your mobile devices. Highly recommend if you haven't had a chance to play with it to to jump on in it. It is a new app. They've already had the app. I had it as an app. You had it as an app. It was a new app. It wasn't available on all devices and stuff. I have Android, I have a Google phone. So like, can you imagine if that was last? So new app, new updates, bigger things, more devices, uh, more capabilities. It is now able to, um, you can already upload a lot of documents and summaries and stuff like that. Yeah. So now they have an offline mode, which is really cool. So you can actually interrogate the things that you've uploaded in an offline state, as long as you know they are locally accessible. Um, and even listen to that offline audio as well, so you can download the content and actually interact with it. And now you can share it anywhere as well. Great. Exactly. Sharing features. Um, it's just a new way to pretty much understand anything, anywhere, was what Google called it. So this is a really cool update. If you haven't used it, check it out. It's a great way to just manage projects insights information in a small, secure kind of format. 
You don't have the full, extensive capabilities of if you're setting up something like a project folder in OpenAI, um, and asking to, you know, create big, interesting, elaborate summaries that you could put into documents and projects. It doesn't have that level of capability, particularly the free version. You are limited to the model. Um, but you are able to find insights and query things really, really well and really easy and obviously create audio off the back of it. Um, and as you know, over the past couple of weeks of the announcements, like even two weeks ago, they expanded the language model so that you can have different you can talk in Spanish, you can talk in Mandarin and all that. So, so now if you're someone who prefers to listen to your podcasts in, in Mandarin and you upload a bunch of documents, you can say, hey, can you summarize this in Mandarin? And boom, here it comes. Do it in Spanish. It'll do it in Spanish. The I think there were like 20 odd languages that they expanded to the translation components. You know, uh, it will completely transform just how we communicate. Yeah. There shouldn't be any of these, like, barriers of understanding. Um, the only barrier is, did you have you got the right features set up to allow your users, your the people you use, your stakeholders who you interact with, even family members. Like there shouldn't be misunderstandings. There will be, because we're human. Will find a way. Anyway, over to Microsoft. What's going on? Yeah. So I think swept under the radar. I just wanted to cover off this one particular thing, because we do like to talk about that business and relevance and what we're doing in the, in, in our daily jobs. That's what. That's what we see. Where we see the biggest benefits of AI. Um, and so for our copilot users out there, I'm not a huge fan of copilot myself, but I know it does have a lot of relevance in the market, particularly with organizations that are finance and consultancies. Um, you know, it's very well integrated into, um, you know, your 365 experience. So you're not having to pull data out or pull your content out into other services, other AI models. You can utilize it right there inside of your documents and spreadsheets. Um, but they had a great update recently around multi-agent, um, orchestration. So copilot now has the ability to essentially you can create AI co-workers to complete tasks right within copilot, so you don't have to use other tools, um, to build these types of agent models. You can actually build them directly in copilot themselves. I personally haven't jumped in how to play around with it, but I thought it was a very interesting, powerful thing, particularly for people who are heavily integrated in the copilot experience. um, and where it's relevant. It seems like it's going to be very, very powerful. So definitely check it out. Don't know if you've got it up here. I think this is just more video summaries and there's a whole heap of video summaries. So it didn't um, the link didn't show exactly where it was, just a really a news article I saw about it. It was very. So I thought that was interesting just to share with people because like, if you're if you are part user, definitely worth highlighting. Um, but definitely the more yeah, the fun things. I mean, you spoke a lot about Claude for last. Have you had a chance to jump in and have a play around? I have to say that no, I haven't because it's just been really busy. 
Um, a with podcasts and B, um, you know, we so we use, we use our own kind of tool for that because I've got like agents kind of like working in the background doing things that we want. We'll have to try out the other tools because it's like, hey, we need to probably upgrade the stuff that we're doing. But more importantly, um, I have probably experienced Claude for if it's been pushed into tools that we use like bolt, you know, for example. So maybe that's been there. But I think they put four into bolt. Yeah they're still in sonnet seven. Yeah yeah okay. Which is I keep looking up I am like news. I'm like refresh bolt, come on. When are you putting four into it? Because it's not that old. It's just really funny and stuff. So I'm going to get it done. Yeah. No. It's interesting, it's interesting. Um, some really, uh, any apology. If they have done it, it definitely haven't announced it. They definitely haven't put it out there that they're utilizing it. But I have been constantly refreshing. But please keep going. Oh, I was just going to say interesting updates that we've got here with, um, the MCP connector that, uh, you know, is they're the ones that started it. And uh, others have continued to develop off of that, but just the continued integration of that into their models. They've got this new files API thing, which, um, they say simplifies how developers store and access documents when building with Claude. So, you know, instead of like, and you see this whenever you're, you're working with, whether it's a bolt, loveable or any other kind of tool or even just ChatGPT, you're not coding, but you're you constantly have to, um, put in the files that you need for each request. So what did ChatGPT do? They they created projects where you can have a limited number of files, like up to 20, you know, for the time being. And there's probably reasons for that because otherwise if they just let you go ham go wild, they their servers would be overrun. But for each project you can put in up to 20 files. But then that doesn't mean that you stop there. You can. That's your reference files. You can still then add in more files for each request. But it's that problem of having to do it for each request. And especially with coding tools, there wasn't a way that you could just have a library. It's like, here's my images, here's my logos, here's, um, core instructions. Now build this up. And that's something that you want to take for each project or tweak for each project. Having Claude, um, be able to do that with their files API, I think is really interesting. The next one is something that I really think is not not so exciting for people like myself, but I, um, you know, for businesses who are and you talk a lot about agents, your business is running agents, and you are one of the biggest things when it comes to running really, really complex queries is memory. And that's the ability for essentially as it's the best way. Easiest way I can describe it is as you're creating an action, obviously, as soon as that action is completed, if that needs to move into the next action, that piece has to be stored. And then the more actions you're completing, you have a huge build up of memory and information that the next element may be needing to query and base this off on. Yeah, it's interesting. And with a lot of models, there's limitations on how much time and how much memory that is. Yeah. So they're extending what they're calling prompt caching. Yeah. 
Um, so so that is going to be interesting in terms of helping out with that memory saving time in terms of uh, you know, not being not having to, um, be so onerous in terms of bringing back like every single thing that is required there. But, um, that it says that it's going to reduce latency up to 90% for, um, by up to 85% for long prompts and reducing costs by up to 90%. But, you know, one of the things on memory as well is that there's a trade off. So what it is is that, um, you know, ChatGPT came out with news. It's like, hey, we can reference every single chat that you've got. When people looked into it. It was really just referencing like a high level summary of each chat rather than the details. Because if you can imagine over two and a half years, um, all of those chats that people have, just having that as something that is just there, referenced by the memory can't necessarily do that without, um, some, some latency. If we, if we reference my memory, like I've scrolled for a good couple of minutes before I get to the bottom of my ChatGPT divide my world, my ChatGPT world, up into the serious and not so serious stuff. But that's exactly the point, is that, um, you know, you can you can do this yourself. You can design a tool that, uh, for example, just a simple chat thread like a chat bot, either it's, uh, a goldfish and every single request, even in the same thread, is goldfish mode, and it doesn't remember anything previous. Or you say do the previous five prompts or just do the whole chat thread, for example, or look at my chat thread and also be able to reference all my other chat threads. You can set things up when you build your own, and when you build your own, you start to realize how these companies work and how AI works. The trade off is is that the prompt that you run will be so much larger, right? Like imagine it's a prompt that's only, I don't know, like, um, it's only 20 conversations long. But then if you wanted to read everything that you've got now it's 100 conversations long. Yeah, of course it's going to cost more tokens. It's going to take more time. And these things are fast. But trade offs need to be thought about. So people are trying to figure out memories in smart ways. Prompt caching is part of that. Yeah. And so essentially what you're also saying there is that even though it is cheaper to do maybe the same query. Now the idea is that you're doing more, it's probably still going to cost extra because you're able to do more and more and more activities, which has as a central cost as well. There's a way to measure all of that kind of thing. The one of the things that someone was testing online, apologies, I forget their name, but they identified it was up to now you could previously, I think the memory was up to like 30 minutes or something maybe. And now you're doing up to like 3 to 7 hours, depending on what type of query which for someone. And that's if you're running your model locally. So if you're applying into a tool bolt, which is a good example of that, even the tools that you've built, um, which is a substantial amount of memory and I'm not sure how that memory is actually that prompt caching is, is storing information, whether it is doing summaries as you highlighted or it's doing the full, uh, the full storing of every single information, particularly as you get down to like the first few queries as it builds up. 
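For the curious, prompt caching on the developer side looks roughly like the sketch below: you flag a large, stable block (core instructions, reference docs) as cacheable, and later requests that reuse that exact prefix read it from the cache instead of re-processing it, which is where the latency and cost savings come from. The model ID and reference file here are placeholders, so treat this as a sketch against Anthropic's Messages API rather than a drop-in recipe.

```python
# Minimal prompt-caching sketch with the Anthropic Python SDK (pip install anthropic).
# The model ID and reference file are placeholders for illustration.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

reference_docs = open("core_instructions_and_brand_guide.txt").read()

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": reference_docs,
            # Mark the big, stable prefix as cacheable; later requests that send
            # the same prefix can reuse it instead of re-reading the whole thing.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarise last week's customer interviews."}],
)

# The usage fields report how much was written to / read from the cache on this call,
# which is a quick way to confirm caching is actually kicking in.
print(response.usage.cache_creation_input_tokens, response.usage.cache_read_input_tokens)
```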
Um, but that is that's a huge amount of, of caching and which makes, you know, the outcomes and possibilities from an agent perspective. Super, super, super super special. Yeah. Like having a proper Jarvis that actually knows you. Yeah, exactly. You got to remind it each and every time. But you know we'll see how this one is. So more on Claude though in terms of their announcements is um, the Claude they expanded their context aware web search and voice mode tools as well. So competing with the likes of what OpenAI has already done, or what perplexity is doing and even what uh, obviously Gemini um, integrating into their search now. So, um, anthropic has expanded their web search capabilities to all Claude users. So it's not just now for, um, for paid users as well. They are opening up for the free plans, obviously in unlimited, um, and beta version. So they've got voice mode and, um, for its mobile application, which we have already seen in a lot of the other AI models. Surprise has taken them so long to do this, but it is in beta, so unless you have beta access, you probably won't be able to see these updates. Probably regional, uh, locked as well. Um, so these enhancements follow the release of Claude for, uh, which happened a couple of weeks ago, um, which is I think called opus opus for um, which is obviously extremely good for coding and other benchmarks. Um, so at the moment it's a very English conversational, um, it's got the tap to interact functionality, the same stuff we see in do I really want to connect it to my Gmail? I don't know, I would, I would why not. Yeah. Why not. Um, yeah. So that's pretty interesting. So I mean, one of the last of the big players to enter into the voice chat mode. But, um, you know, pretty, uh, should be a bit of fun, particularly for people who are more heavily invested into Claude and having a look at some of the benchmarks between Claude and Gemini and OpenAI. They're the big three players, I guess, on the on the field right now. Yeah, there's others out there, but still. Yeah, I think they're the main ones. People are really, really liking some of the output, um, that Claude is doing, particularly from just a, just a, just a language framework in terms of how it structures things, how it talks back. Yeah. People like it. They find it friendly. That was one of the first things that was said when it came out. People just like it's back and forth, um, how it's structured and thought about things or how it was just like, this is all preprogramming, it's all custom instructions. Um, but yeah, that it's exactly done well with that. They really, really have and, you know, not no comparison on whether it's getting information right or wrong, but yeah, just just the basic friendliness of it. And obviously we've seen a lot of stabbing at OpenAI's ability to exaggerate and be a bit too kind at of things. I think, like anthropic has just found that nice, where Gemini has probably gone completely opposite direction. It's more like, you know, this is the matter of fact. Claus found that sort of nice little middle ground where it just feels like, you know, you're having a conversation with potentially an AI tool or human and, um, in the literal sense. So that's pretty cool. I can't remember if it can create images, but I can't, I can't I can't create it. No, no, no image generator. What's the next one, Chris. Um, so that was that's all the big stuff. I thought we'd go into some rapid fire news. Um, good old ChatGPT. They're launching some sign in capabilities. 
Um, so essentially, you know, you may be subscribed to other platforms and other tools now, you. Well, not now, but coming soon. You're going to be able to. Do you know how you can use Google to sign in to stuff, right. Yes. You know, you can potentially maybe using your Apple ID to sign into stuff or use ChatGPT. Now they're going to ChatGPT I think is becoming has such a big following. It's now you're going to be using ChatGPT to sign into things. Why or what benefit that this is going to provide? I do not know, but apparently that's what they're going to be doing. Um, they obviously have a big vision around just this AI tool. Knowing you and understanding the things that you do and how you do things, I think this is probably that first step in that play, in that it's like, cool. If I know all the tools and all the things that you're signing into this, I mean, this is why Google knows you so well. So if you use Google to sign into everything, it knows everything that you do. It knows you have an open AI account, or it knows you have a particular new subscription account, and it's able to build information around that stuff and then serve you news and ads. And that's crazy. Or in OpenAI sense, they'll probably do the same thing. But battle of the AI. Yeah. Battle of the AIS. Gemini like sorry. Yeah. So Google's Gemini I can imagine a scenario where you can use ChatGPT to sign in to some, um, whatever it is, like, uh, you know, it might be your Gmail. That would be hard to see, but but maybe. But or another app where Google's Gemini also has access to. And you, you give it a command and it goes well based on your emails. I don't think you should do that. Yeah. And they you have the eyes fighting each other within the apps that they're allowed to sign into, and they just go off and you're just like, I can't deal with you guys. You're like divorced parents. Like, what the frick? Anyway, fight. Doo doo doo doo doo doo doo doo. Next. Um. Awesome. So telegram has just struck a deal with, uh, good old Elon Musk's AI service. So grok is now going to be laid into telegram. Um, it was like a $300 million, um, payday out for, for Google Telegram in this deal. But essentially they're going to become the AI model that backs telegram, which is a huge move and a huge play, that one. It was spoken about quite a bit in the dev community. Yeah, because so it is a partnership. It's not a buyout. Something like oh what is this. Yeah. So it's a partnership. It's definitely it's not a buyout. So it's kind of like you know Google being the home page of Apple's, you know, Safari and other other browser tools. So um, grok is going to be the AI tool of telegram until someone probably bids up bids outbid them, but they do have a they do have a good share ownership in the in the business as well. I'm in a few telegrams and um, it's really interesting. I need to do it because like, it's, uh, part of, like it's part of blockchain community related stuff. And it is, um, you know, the thing that I could see in there is like, people are arguing. It's like, hey, grok, tell this mofo he's wrong. Yeah. So let's see, let's see. That's that's interesting. Uh. What's next? So Mistral has launched their agent API, which is pretty interesting as we're seeing a lot of open source, um, program or tools, AI models, um, moving down this route, expanding their capabilities, trying to bring people on board. We saw Meta's now doing their whole startup program. 
Um, you've got even R1, who's released a few agent updates over the last couple of days as well. So now Mistral is is joining that battle. Um, so these agents can perform tasks, maintain long term context, and manage multi-step workflows. Other AI it is again, it's open source so it may not have the full extensive frameworks of things like Claude and and open AI, but the fact that it is open source and it allows you to do things that you know, at the cost of your own machines. Um, so that's really, really interesting and really, really powerful to see. I love the fact that Europe is stepping up to the, you know, stepping up to the plate here with a lot of their codes. I hate the fact that it's pretty much a China America battle right now. And Mistral is there holding holding the lines really, really, really nicely. Yeah. It's it's uh it's really interesting. You can see the video playing in the background there. What they've got with their GitHub, uh, demo integrations etc.. Yep. It's, it's yeah, it's interesting that there is more of this push on the open source side of things. Like it sounds crazy, but the Google news this will get old, you know, give it a week, give it two weeks. It'll, uh, degrade and people will just get used to I don't know, maybe people won't, but people will. I feel like we're going to get used to seeing these, like, crazy ass videos, and then they'll be like, ah, like, nah, that's not that's not great and stuff. Yeah, there will be a way for you to actually look real. You know, it could be like hostage videos, but in reverse, where you hold up today's newspaper and it actually is that how do you how do you know that? It's a it's a real newspaper. We just don't know anymore. So interesting. Interesting. Speaking of the not knowing. So coming from the world of blockchain and stuff, um, there is a concept there where you can, uh, have digital IDs and it's not a digital ID where you everything about you is just known and it's transparent to everyone, but it's got privacy features in there. But the point of it is that you can have something, um, that is cryptographic that only you could have if you have a capability to sign a video, to sign that video and say that you're the one that created it. If you see other videos out there and it doesn't have that signature that directly comes from you. I feel like we need, um, a, an ID context protocol or some sort of standard out there in the AI space. And it has to be blockchain, because that's the only thing you can't. I was going to say shout out to all my database like friends and stuff, and thinking that I would just do like a database thing. Databases can be tampered with, log files can be tampered with. Whoever has access and control to that. But when you run things on a blockchain and it's done with both the cryptographic side of things, but the privacy preserving features of zero knowledge proofs. Look that up. Ask I, I reckon that's going to converge and be the way that we can prove that videos are real or not. Is that something we can do at a low zero cost. So zero cost? No, that's that's one of the blockchain is, is obviously gas fees and things like that. Like how do we start to do it in a way that we can track and ID those things at at lower cost? What's the what's the method there? There is already like low cost blockchains, um, out there. 
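To make the "sign your video" idea concrete, here is a rough sketch of just the cryptographic half, independent of any particular blockchain: hash the file, sign the digest with a private key, and let anyone holding the matching public key verify that the clip is untouched and came from the claimed creator. Anchoring the public key or signature on-chain, and the zero-knowledge privacy layer mentioned above, are separate pieces not shown here, and the filename is a placeholder.

```python
# Rough sketch: hash a video file and sign the digest with an Ed25519 key
# (pip install cryptography). The filename is a placeholder.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The creator generates a keypair once; the public key is what a registry
# (on-chain or otherwise) would tie to their verified identity.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Hash the clip, then sign the digest rather than the raw bytes.
with open("episode_31_intro.mp4", "rb") as f:
    digest = hashlib.sha256(f.read()).digest()
signature = private_key.sign(digest)

# Verification: recompute the hash and check the signature.
# Raises cryptography.exceptions.InvalidSignature if the file or signature was altered.
public_key.verify(signature, digest)
print("Signature checks out: this clip was signed by the holder of the private key.")
```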
So there are already ones that are they're either aggregating transactions together and for some they've done that in a way where it takes transactions off chain, it takes it offline and actually does it. So you have to trust the organization that's doing that. But um, even if you do that, that's still better than just only using databases. And there are blockchains out there that are low cost where you can do all of this kind of stuff on chain. I won't go into the details there. What's the point? What are we talking about here for cost? Um, it can be fractions of a cent. Yeah. Nice. You know, it can be up to a dollar. It can just depend on what it is that you're exactly doing. But, um, what Chris is referring to there. And like I said, I'm not going to get into the details here, but there are blockchains where it depends on supply and demand that the cost of transactions goes up and down, like one day it might cost you $1, another day it might cost you like $40 to do something. That's not the types of blockchains that you would use. No, no, that's what I'm talking about. No, not at all. Well, you're talking about demand on gas fees and stuff. Yeah. Yeah. Well I'm talking about the fee for actually running. So when you're putting the, the ID onto something, there's a cost associated with that. And some of them are expensive and some are not. But there's ways around that. But I thought maybe you were referring to, like, the high and low cost, like the changing cost of some things, because that is also like a problem. But in terms of just having low cost stuff, there are ways out there that this can be done. So. Um, and even if it's like, okay, it's too costly to do it on a transaction by transaction level, well, it can aggregate those things. And it's just like how, um, how we batch those getting into the details. That's not the point of this episode and stuff. We can dive into that in another one. But there are ways and I think like the I think the key point there is that there are businesses like Google can afford that. And I think it's it's going to come to a crunch time where it will be responsibility on them being able to, you know, identify, post it being rendered that this is a fake image because it will drive people to, you know, believe things that aren't real. Yeah. It's the the nature of the beast, unfortunately. Exactly. So, uh, what's next? Yeah. Um. It's kicking. Uh, staying in the sort of the business domain world, Salesforce acquired a cloud data management firm, Informatica for $8 billion. Is that all? That is insane. So I don't know if anyone who doesn't know what Informatica is. They're a data management cloud, um, organization that simplifies data access. They automate lots of like, intensive processes around data and AI readiness and do a lot of enterprise level sort of transformations and integrations. Um, and so they acquired them with a big move to obviously enhance their AI capabilities across data and cloud services. Um, so really massive move there with them, especially for that price. I mean, just talks to how much bloody money that Salesforce has, but $8 billion. Speaking of Salesforce, they had quite, I would say high costs. I think I saw something around like the 40 cent mark. And that might have been us in terms of like how they do their. Yes, sorry, how they do their um, what is it like their agent kind of queries. But I think they lowered that recently, the last couple of weeks to like $0.10. So there's a cost to doing that. 
Um, but still like I like, I think, uh, whether that's cheap or not, it depends on the, the output that you get, the quality of it. But yeah, just getting more and more, um, depth to what they're providing with Agent Force. It is a big deal. Um, you know, if we were speaking many years ago and stuff, just talking about, like, uh, data, then this would be, um, major, major news for a lot of folks that are looking at this as like, from an AI perspective, they will never have heard of Informatica, but this is something, um, $8 billion is nothing to trifle with, you know? No, no. And and particularly for Salesforce, which is, you know, they're holding on in the AI space, I wouldn't say they're one of the leaders. They do talk about it a lot, and they're doing a lot of stuff. Um, obviously, because they have such a foothold in the market from a from a data CRM, um, you know, analytics and management kind of perspective. So this is a huge move for them to improve how their capabilities and their services to stay relevant in that market. And as you mentioned, Agent Force is there is one of their biggest, um, their AI outputs within the space. And so building on that capability from specifically from a cloud. Uh, apologies. I have a bit of a bit of a throat from our perspective is we'll clean it up with a no joke. Yeah. That's right. Um, uh, keeping things moving so deep. Speaker one um, as their app is back in, back in the headlines, they're like number one for AI App services. They weren't yesterday, but at the time of this article, they were number one on Apple and, uh, androids. Yeah. Um, uh, storefronts. So I don't know why it's suddenly kicked off in a bit of a boost up there. Um, but they have updated their models recently. That could be part of the play there. Um, you can actually download the model though, on hugging Face, so it's not rolled out to all of their applications and services. So it's only in the, um, in any of the local instances that you run. Um, so they've been improved. They've improved to a small extent, their reasoning model. Um, you know, you know, as, as they should keeping updates going and moving out. Um, so no solid announcement on R2 or anything yet. Um, but they are, um, you know, still still there and still playing hard and from an open source perspective, as we talked about a moment ago, probably one of the better ones out there to be playing in, even if they are from China, their deep league moment. I remember, like going way back. I feel like a grandpa in, I now know, but like the old I remember back in my day when I won came out. Yeah, exactly. That's exactly it. It's like it felt like a while ago, but, um, and how it really it made, uh, a lot of companies that were traditionally in this space do a bit of an about face and go, oh, what is this, this approach that they've got, um, really upends how we do things. And so the deep version, that's why we have in ChatGPT and perplexity in others, their version of deep search, deep research, deep something, you know, they started because of these guys. And these guys come from a trading background. They were a hedge fund. Um, they were called High Fly Capital Management, and they were basically just doing it for as a bit of research. Yeah. Smart folks that were working there. 
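Since the updated R1 sits on Hugging Face rather than being rolled into every hosted app, pulling it down locally is a one-liner with the huggingface_hub library. The repo ID below is an assumption to double-check, and the full weights are enormous, so the smaller distilled variants are the practical route for most machines.

```python
# Sketch: download the updated R1 weights from Hugging Face
# (pip install huggingface_hub). The repo ID is assumed; verify it before running,
# and note the full model is a very large download.
from huggingface_hub import snapshot_download

local_dir = snapshot_download("deepseek-ai/DeepSeek-R1-0528")
print("Model files downloaded to:", local_dir)
```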
And, um, it's it talks one of those things that when you limit access to whether it's hardware or software, uh, in this case, it was a bit of both to countries or to innovators or anyone, wherever they are, just the ingenuity that comes from that. Because China was limited in terms of what kind of tools it could access. And look, they're like, okay, hold my beer. We're going to look at this from another way because we need to improve. We're going to create this thing. And they did. And now others have followed it. So it is kind of crazy the story. Interesting side note there on the on the limitations. And obviously the sanctions are being put in place from a graph from a GPU perspective. Um, Nvidia has been, you know, updating some of their recent GPUs to export into China. They've like obviously tweaked them to make sure they're compliant. Um, but there's been AMD has also stepped up to the plate now and is starting to roll out and sell stuff into the China market as well, which is a lot of pressure on the Nvidia space and a couple of other big players. So I think that's going to yeah, that opportunity is really going to accelerate how they um, you know, how China starts developing and focusing if they're able to get their hands on improved GPU technology that isn't Nvidia focused? You know, that's a big win for them. Absolutely. Uh opera next. Yeah. Last one on my belt. We've got opera browser. This is just a small one here. They've um they've just made a nice little AI update. I don't know if. Are you a big, uh I mean, are you an opera user? I'm personally not. I'm not. My dad is. Your dad is. Yeah. Yeah. That's awesome. He uses all the browsers, I love it. I'm like, dad, why do you have Opera and Brave and Chrome and what what what? Anyway, there's a lot of things. Like, a lot of people are coming out and pushing harder and harder to say, don't use chrome like it is. I mean, it is a data reader. It's there. Just like, come on. Like we talked about before with OpenAI trying to have your login, um, understand where you are. Let's watch a bit of this. So what is an AI agent? You probably have an image when you hear the words AI agent. Maybe it's ChatGPT or robot. Bite me. Or maybe you think of a dystopian future where the machines take over the world. Humans love their stories about threats almost as much as they love pictures of cats. AI agents are systems that fully autonomously within the city limits. That's exactly what Opera Neon is, and its Antic browser that can act on your behalf. First, there's me on chat. The conversational layer that helps you talk to these agents. They get into it more and more there. But what a great add. I do love this kind of things, but, um, it reminds me that just I digress AI sideshow here, but there is a show called. Uh murderbot. Yes. Is that what it's reminding you of? A little bit at the start. Third episode I'm up to. I haven't watched it yet, but Joel Kinnaman, it's on Apple TV or Amazon. But have you read the books? No, I haven't read the books, but I had a couple of friends who were talking about they were saying that books were amazing. So check it out. I don't know if you get the Murderbot vibes. Yes, 100%. But yeah, interesting that, um, what they've got here with the browser operator agent. Yeah. I mean, so like, obviously you're a big, um, brave browser user. I use brave on my systems as well. This one, I just stupidly have Chrome, but I'm not really logged into anything. 
Um, but so what they're able to do now is actually do a lot of that agent style workflows for you with straight within the opera, um, neon browser. So three new buttons on a sidebar, you can see they can do, chat, do, and make um, the chat presents a chatbot interface, get all the answers and queries and ask for information about the web pages. Um, and essentially, yeah, it just creates a new, unique experience and competing against the other browsers out there to be able to do whatever you want. That's great. Um, yeah. The whatever industry you're in, um, browsers or like creating something else, like bringing I more and more into what you do, I think, is there's still a lot of opportunity. There is what I want to say, because if you look at the majority of the web, it's still not AI ified. Exactly right. Like you go to government websites, councils, you go to like software websites. You don't not all of them have chatbot things. And I would argue that a lot of them could because they've got the money they spend a lot on, like marketing and stuff. Why can't you have something where it's just an interactive bot that helps you navigate the website, or just AI powered features when you're watching videos and stuff on here? Why don't I have with TechCrunch? Here's this video and I can click some buttons and just ask more about this. Yeah. Why, why, why. And we should continue to ask why. Um, and maybe you can do something about it, folks, because the next thing I want to show is that there is the launch just overnight of the bolt was largest hackathon, $1 million plus Loss with a plus in prizes. So, um, the submissions. You've got a month to do this. So it started today, May 30th. Um, overnight in the US, there is 52,959 participants. Now, maybe some of those will be teams. That's just me registering. Yeah, I think uh, yeah. This what a great initiative. $1 million really does. Um. It'll change. That'll change someone's life. Yeah. Yeah, exactly. So, you know, vibe code away. Um, but just remember that you people will probably come to this for the first time. Having never used bolt. My advice would be because you've gone through this as well. Give them advice. We're competing in this. No no no no, we got to do it. It's for the good. Our ideas. Ideas are free, folks. It's how you execute it and stuff. But like, um, the, the advice that I'd give is that you might get frustrated. You will get frustrated, actually, if it's the first time that you're using it. But just try to get over that hump. Talk to others there, discussion forums and stuff like that. And the thing is, um, sometimes, your boat won't be able to do the things that you want it to do, but there are ways around it. Talk to others that have a bit more experience and stuff. You can certainly get there. We built some crazy things that I'll show later on with that, but looking at this. So, um, one winner will get, uh, some big prize. Second place will get 75,000 USD, 25,000 AWS credits. Fourth place. Oh no, third place gets um, 50,000 USD. So there's one big winner of something. There's a grand prize of 100,000. Second is 75, third is 50, fourth is 25. Fifth is 22,506. All the way down to 10th. And you still get 10,000 for that. So look, it's not a one size, one winner fits all, takes all kind of thing. There are um, the various things. There's even, look, a $10,000 prize in the APAC region. So for a regional highlight where we are, that's good. You know, there's some good stuff there. So in any case, um, we will be doing something. 
And because you have to create a video for this, we'll be showing you folks what Chris and I, and another partner, are going to create. We think it's going to be an interesting, useful kind of thing. Really interesting stuff here. Right, and I think even showcasing how some things are created is valuable, because there is a certain way to use it that will create a benefit for you, as opposed to just vibe coding straight into the chat. That's kind of what I love about it. And you've been teaching me a lot about it, Mark, so this is not me talking out of my own knowledge base; this is me learning from the specialist, the expert, to my left here. But yeah, definitely I think we'll share some stuff and really get people on that journey. Yeah, it's a very powerful tool, and hopefully when they get Claude 4 into it, oh my god, it'll be even more game-changing.

One more thing on that: if you've registered, you get a whole lot of things in your launch pack. You get an access code. I'm already on Bolt Pro, well, they call it Pro, and I don't know what that means exactly, because I'm on a higher level of subscription. There are different levels of Bolt Pro, depending on how many credits you want. Do you want to bring that up in terms of pricing? Yeah, there's Pro 50, Pro 100, Pro 200. I'm on the Pro 200 because I use this quite extensively, and what I found on the lower tiers was that I just kept buying refills of credits, so it didn't make sense. And for anyone who's wondering, the 50, 100 and 200 are a representation of the price, Pro 50 is 50 bucks, Pro 100 is 100 bucks, not a representation of the tokens. The tokens are about half of that, in terms of millions. But the tokens, they could make that whatever they want; it's not a one-for-one where a million tokens means a million tokens of Claude. They could have made it 10,000, it's however they price their tokens. Exactly. But you use tokens when you're in Bolt anyway, and they give you free tokens for that.

They also give you, through their partners, a free domain for a year and free website hosting, if you're registered. That's awesome. Do you know who that's through? Uh, I don't have all the details here, maybe it's in here somewhere. Anyway, I'll find it, there are resources here. Yeah, that's great. 52,000 people. So free hosting, free domain registration. I'll find it in my email, which I don't want to show right now.

So yeah, very cool stuff out there, keep an eye out for that one. And can people still register now that it's started? You know what, I don't know. I don't see why they wouldn't allow it. What is it called? Bolt Hackathon. Let's see if we can jump in, hackathon, 30th May. Yeah, that's right, hackathon.dev is the domain, we'll put it in the show notes. You can still register, join the hackathon, see what happens, sign up with Google, why not. On Chris's screen we can see it's all there.
So it's still fine, you can still register. It just means you may have less time to deal with it and get all the free stuff, although knowing how quickly you can do stuff in Bolt, I think a month is almost too long. It will help you flesh things out, and that's great, but typically hackathons, if you're not used to them folks, start on a Friday, people code away on a Saturday and present on a Sunday. They're usually quick things. So given this is a 30-day thing, it's partly to bring in more people who will build with Bolt, which is great for them and their partners, but it also means the quality bar goes up: what you could normally do in a day or two will have to be far higher quality. And we're thinking of this in terms of real-world value, stuff that will actually impact people. There are, as you said, 50,000-plus people registered, and I can guarantee you there are some incredible minds there that are going to do amazing things. So I'm actually pretty excited to see what the outcome of this is going to be. People share stuff online, they show a lot of cool things they're doing, a lot of them are games, some of them are spreadsheets and tables and finance platforms. So this is one of the first times I think we'll publicly start to see some really interesting, innovative stuff. What a great initiative. Yeah. Imagine if folks in blockchain did that, you know, create something real-world, here's $1 million in prizes. There are groups out there doing that, so we'll talk more about that later.

But yeah, that's it for me. That's it for the show. Thank you so much for joining us. Have a great week, guys. We'll catch you next Friday, I guess. See ya.

News flash! Oh, you're still here? Good, because there's one more thing that literally came up while we were recording the podcast. Perplexity Labs just launched, with their Pro AI suite, a bunch of awesome new features. Specifically, we're looking at AI-powered spreadsheet generation. They're now leveraging advanced AI capabilities to generate sophisticated spreadsheets and data visualisations, requiring only around ten minutes to complete these tasks, so it's pretty rapid for the amount of data you can push in there. It can create custom code, organise data and implement formulas, systematically arrange the generated files, including code, images and charts, in dedicated tabs for easy access, and it supports integration with other tools through automation templates that connect Perplexity with things like Google Sheets, which is great because I'm a big Sheets user. It also enables users to extract and format citations, ensuring research credibility, conducting a lot of data analysis, and creating dashboards to support those tools and insights. Anyone on the Pro-level subscriptions can access that. So, as I mentioned: interactive dashboard creation, ten-minute research workflows, disrupting the search landscape, as they're claiming here.
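A quick aside on that Google Sheets integration point: we don't know how Perplexity's automation templates work under the hood, but the general pattern of pushing AI-generated tabular results into a sheet is simple. Here's a rough sketch, assuming the gspread library, a Google service-account key, and a hypothetical spreadsheet called "Research output"; the rows are placeholder data standing in for whatever an AI workflow produced.

```python
# Sketch of pushing AI-generated rows into Google Sheets.
# Assumptions: gspread is installed, service_account.json is a valid
# Google service-account key, and a spreadsheet named "Research output"
# has been shared with that service account. Not Perplexity's actual API.
import gspread

def push_rows(rows: list[list[str]]) -> None:
    gc = gspread.service_account(filename="service_account.json")
    sheet = gc.open("Research output").sheet1
    sheet.append_rows(rows)  # appends below any existing data

if __name__ == "__main__":
    # Hypothetical output from an AI research run: company, metric, value
    generated = [
        ["Acme Corp", "ARR growth", "34%"],
        ["Globex", "ARR growth", "12%"],
    ]
    push_rows(generated)
```

The plumbing above is the easy part; what Perplexity is claiming is that the model also generates the formulas, charts, citations and dashboards on top of it.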
So actually, you know, they've been a very search-focused tool first, even going into the e-commerce space, and now they're switching things up, focusing on finance, spreadsheets, data sets, really expanding the capabilities there. Definitely something to check out. Great to hear Perplexity is still keeping up the pace with the market. And that's it for me. Enjoy this. Tech trends blazing, fire rising to the sky, digital world we live in.