Utilizing Tech - Season 7: AI Data Infrastructure Presented by Solidigm - 06x09: Modern Application Development Using AI with Paul Nashawaty of The Futurum Group

Episode Date: April 15, 2024

AI is accelerating application development and modernization in many ways, but developers are just ramping up their use of the technology. This episode of Utilizing Tech includes Paul Nashawaty, who focuses on application development at The Futurum Group, discussing this topic with Allyson Klein and Stephen Foskett. Use cases for AI include documentation, chatbots, data integration, programming co-pilots, and more. Regardless of how AI is used, organizations must accept that they are ultimately responsible for the products and outputs produced. Accelerating testing, and thus the entire DevOps release cycle, is one area where AI is seeing incredible growth. Another popular concept is retrieval-augmented generation (RAG), which brings existing datasets to generative AI to improve results. There is currently a lack of confidence in AI-based solutions and concern about the complexity and level of effort required to bring them to market, so vendors must help make deployment easier and better integrated. Product vendors are addressing this by delivering solutions with partners and popular platforms and frameworks. Open source tools and open data are also helping to move AI technologies forward. Developers want invisible infrastructure and platforms that just work, and the emerging AI PC segment promises more processing power on the desktop. 2024 is shaping up to be the year of AI in application development.
Hosts:
Stephen Foskett, Organizer of Tech Field Day: https://www.linkedin.com/in/sfoskett/
Allyson Klein: https://www.linkedin.com/in/allysonklein/
Paul Nashawaty, Practice Lead, Application Development Modernization at The Futurum Group: https://www.linkedin.com/in/paulnashawaty/

Follow Utilizing Tech
Website: https://www.UtilizingTech.com/
X/Twitter: https://www.twitter.com/UtilizingTech

Tech Field Day
Website: https://www.TechFieldDay.com
LinkedIn: https://www.LinkedIn.com/company/Tech-Field-Day
X/Twitter: https://www.Twitter.com/TechFieldDay

Tags: #UtilizingAI, #AppDev, #AI, @TheFuturumGroup, @GestaltIT, @TechFieldDay, @SFoskett, @TechAllyson, @PNashawaty

Transcript
Starting point is 00:00:00 AI is accelerating application development and modernization in many ways, but developers are just ramping up their use of this technology. This episode of Utilizing Tech includes Paul Nashawaty, who focuses on application development at The Futurum Group, and he's discussing this topic with Allyson Klein and myself. Learn more about the use cases and the practical application of AI in modern application development. Welcome to Utilizing Tech, the podcast about emerging technology from Tech Field Day, part of The Futurum Group. This season of Utilizing
Starting point is 00:00:33 Tech is returning to the topic of artificial intelligence, where we will explore the practical applications and impact of AI on all areas of enterprise technology. I'm your host, Stephen Foskett, organizer of Tech Field Day, and joining me as my co-host this week is my good friend, Allyson Klein. Welcome to the show, Allyson. Hey, Stephen. It's great to be back. It's always nice to have you here. You've always got something good to say. And this week, we are talking about the impact of AI on application development.
Starting point is 00:01:05 I can't wait for this topic. You know, one of the things that I've been following is just how DevOps teams and application teams are looking at AI as a transformative force in the way that we think of everything about application development and deployment. And I want to learn more. Yeah, I can see a bunch of different ways that it could impact us. I mean, you know, number one, AI co-pilots are helping developers write better code. I'm also, of course, seeing a lot of AI assistance in terms of a lot of the back-end processes,
Starting point is 00:01:43 everything from, you everything from source control, the documentation, and of course, there's the whole operations aspect of it. And then there's all these machine learning and AI modules that are being integrated into applications and developers are being called on to implement that stuff. One person that I know that knows a lot about this topic is my colleague here from Futurum, Paul Nashawati. Paul focuses on modern application development. Welcome to the show. Thank you, Stephen. Thank you, Allison, for having me here. It's a really exciting time, really exciting topic to be talking about app dev and application development and how AI impacts it.
Starting point is 00:02:20 So my role, as you mentioned, Stephen, I am the practice lead for the app dev practice at Futurum Group. So I focus on application modernization and application development. And if you're interested in this conversation, Paul and I are actually going to be hosting a application development focused Tech Field Day event. The first ever app dev field day will be the end of May. We would love to have you all tune in and join us for that conversation. But I guess kick things off, Paul. You know, you're an analyst. You're focused on this space.
Starting point is 00:02:52 I don't think there's anyone who's better positioned to talk about this. Tell me how AI is impacting modern application development. Well, Stephen, you know, AI is impacting modern application development in a number of different ways, right? When you look at it, it's, you know, there's acceleration of modernization practices, that modernization is top of mind for every CIO, right? When looking at what's happening with the infrastructure, whether it's heritage environments or heritage applications that are moving to cloud or moving to cloud native environments or cloud ready environments. All of this really requires that faster kind of movement and approach to basically modernize, right? And when you ask these CIOs what their number one challenge is,
Starting point is 00:03:36 it really does come down to skill gap issues and then is complexity and there's challenges. So AI is being used to address some of these challenges. But it's interesting because what we also see in our research is even though AI is really at the early stages of assisting these modernization efforts, we see that only 18% of organizations are using AI in their production applications in our recent research. So it's an interesting kind of place, but we also see that 27% of respondents to our survey are indicating that they are trying to use and understand how AI can help with these experts.
Starting point is 00:04:13 I can imagine that IT organizations are being bombarded with requests right now from different lines of business saying we need to integrate AI into how we do X. What do you think is the challenge for organizations to actually move? Is it skill set gap? Is it tool gap? Is it infrastructure gap? Or is it all of the above? They might, yes, from my perspective, all of the above, right? And of course, that's an analyst response because I can say that, right? It really does impact all of the
Starting point is 00:04:52 different aspects of the development cycle. I think the way I would look at it, Allison, is frame it up into how is AI being used to help address these projects, right? And when I look at it, I think of it in the context of faster code creation, or I think of it of improving code. These are use cases that the developers are looking at, generating documentation, right? That's a great use case for AI. And also customer satisfaction, right?
Starting point is 00:05:17 We've also seen that chatbots have been in play for a long time, but chatbots are not always the best way of kind of delivering, but with AI, Gen AI, we see that, you know, that customer experience is really being improved. Actually, we see that 35% of respondents are using AI to do so. So that's an interesting kind of perspective as well. To actually pivot a little bit, I would also say from a tech perspective, AI is also looking at ways of integrating these new LLMs. But this isn't new. Applications, as you modernize and you move forward, organizations have spent a lot of time scrubbing and building their data pools and their data lakes that they've already had today. They don't want to give that up.
Starting point is 00:06:05 So attaching those data lakes to their new intelligence or new AI models, that's incredibly important for a lot of organizations. Yeah, Paul, that was one of the topics that we discussed at AI Field Day specifically was this idea of using LLM as the user interface, as it were, and connecting it with structured data sets on the backend to basically surface existing data better and help serve customers by leveraging hard structured data sets. I think that that's absolutely something
Starting point is 00:06:40 that we're gonna see more of in the future. Obviously, data lakes and data warehouses and data lake houses and all these kinds of things, that has been part of the modern application world for quite a while. But I think that AI is really going to accelerate that because people are so worried about what LLMs might do. No matter where these things are being used, though, is that a concern for application developers in terms of using AI technology and the fear of hallucination and all that? Yeah, it's a great question. When you think about the data structures that are being used, a lot of these heritage applications are being encapsulated as systems of record to be used for modernized approaches. So you'll build new front-end applications, maybe with React application in the front end,
Starting point is 00:07:29 that attaches to the existing system of record in the back end. What's incredibly important is the right calls to make to those data sources so you have the right information that's presented up to that application. If that information is not presented up correctly, you have a couple of issues. One, you could have hallucinations, as you mentioned, right? But you also have the issue of incorrect data or silos of data, right? Not having a holistic view across your data set. That is very bad, right? Because now you have a partial data set of what you're looking for. And for years, organizations have spent time to build these systems so they have a holistic view of their organization. And now to put an AI front end on it, you know, is they don't want to break their existing systems they have in place. So that's important to kind of think about. your back-end LLM or your private LLM to your private AI, to your new systems and new
Starting point is 00:08:29 modernized applications, it's incredibly important that you don't like co-mingle that with public information because now you have, you run the risk of sharing your IP externally, right? And if you want to share, you know, that could be run into a whole slew of issues. So there's also that. And then the final point I'll make is ethical concerns and governance concerns. You want to make sure that you're in compliance
Starting point is 00:08:54 within your organization. So if you're running checks and balances against your AI, you need to be able to make sure that you're in compliance and governance within your own organizational needs. I mean, we saw recently an airline that basically went out and did some weird things that happened and they didn't honor those things that happened. They said they blamed it on the AI. Well, that's not acceptable. Right. You can't just say, well, my business did this. It used AI and blame it on the AI. That doesn't that doesn't work.
Starting point is 00:09:23 So that's one of the things that organizations have to think about. What do you see the LLM providers that are training these massive models on public data, providing in terms of tools to integrate LLMs as quickly as possible? And are there more tools needed in this space to enable DevOps teams to take advantage of the tools?
Starting point is 00:09:49 Yeah, I think if you look at the tools and the stacks that are in place, businesses are about accelerating their application release cycles, right? So the CICD pipeline is that infinity loop of moving the application and making sure you have that agile development process. What we see in our research is 24% of respondents in a 378-person survey indicated that their organizations wanted to release code on an hourly basis, on an hourly basis, yet only 8% were able to do so. And part of the reason behind that is because the release cadence has to happen very quickly. So when you're looking at the DevOps tools and these tools, these learning models that are in place, these, you know, the applications that are in place, everything I mentioned earlier about governance, compliance, regulation, all needs to be taken into consideration before you push it out the door. We we haven't even mentioned security, right?
Starting point is 00:10:45 So if we look at security as another factor, that's another area. The other thing I would also mention is testing. In the CICD pipeline, we see testing as a major factor. You know, that's an area that many organizations will say, well, we have an agile software development methodology. We have new sprint releases or code releases every two weeks. We'll just address the bug fixes with the next release. However, that's not a really great example because you're using
Starting point is 00:11:10 your client base as your test bed. And that's the fastest way to lose those clients. Yeah, I think that that's a real good point, Paul. I remember you bringing this up to me as well, that testing is one area that AI can really, really accelerate things because it can not just follow a test plan, but basically create adaptive test plans based on whatever is happening, whatever is being produced, right? Absolutely. And I was just on a briefing yesterday with a client
Starting point is 00:11:40 that is actually potentially coming to the AppDev Tech Field Day event, which is going to be exciting. But that client focuses specifically on accelerating test cycles for the DevOps release cycle. And that's exactly what organizations are looking for. We typically hear the term shift left in the context of security, right? You know, shifting left from day two to day one and day zero um what testing does is shift left the testing cycle from day two from the release side to the um to the to the build and earlier in the cycle so that is an area that i'm
Starting point is 00:12:19 seeing a lot of growth in the vendor world of accelerating the testing so moving uh focusing on day zero and day one, so the build and release side versus on the operational side after that. And it occurs to me that testing is one of those areas like documentation that let's say developers have not always been that enthusiastic about. So I think that they may be enthusiastic to have an AI solution to help with that. Oh, for sure. This is one of those areas that basically you're trying to shift your skill, your skilled resources from reducing those redundant tasks. And testing is one of those things where if you're putting in a test harness,
Starting point is 00:12:58 an AI solution to remove those redundant tasks from that cycle, it really helps, not just helps the accuracy of putting out the application faster, but also the enjoyment of the developer's job, right? The developers want to innovate. They want to build new code. They don't want to live in maintenance mode. That's not a fun place to be, right? And living in that mode and testing is just,
Starting point is 00:13:20 it's redundant and tedious. So automating that and having AI as a way to kind of get you there, actually what we're seeing is 36% of respondents in our recent survey indicated that shifting skill resources from redundant tasks, shifting those skill resources away from redundant tasks to use AI to take care of it. So it's definitely an area that organizations are looking at, even in the early stages of AI. I've been hearing a term called retrieval augmented generation quite a bit in conversations.
Starting point is 00:13:51 Can you just define that and break that down and fit it into what we've been talking about? Absolutely. So when we look at RAG, retrieval augmented generation, we look at the combination of using predictive and generative AI to kind of produce the results that you're looking for, right? So basically combining those data sets and the results to accurately deliver what you're trying to produce using AI gives you a more up-to-date set of content and resources to produce that AI solution, or I should say that output that you want from a content perspective,
Starting point is 00:14:30 but have a real-life backend kind of support to it. When you think about where IT organizations are, you've painted this incredible picture of all these things that need to be thought of. You've also stated that a low percentage of IT organizations are fully actioned on all of these items. What do you think it's going to take to get over the chasm, if you will, and move DevOps teams into the mindset of AI-centric application delivery? Yeah, Allison, I really like that question because I think about it in the context of maturity. There's going to always be stages of maturity for organizations where
Starting point is 00:15:12 they're going to take their approach and how they deliver based on their own needs. So for example, I mean, when we look at organizations that have DevOps teams, they have DevOps engineering teams, as well as platform engineering teams. Do they have these different isolations? Are they utilizing SREs for their whole complete site view and such to have a holistic view across the entire organization? Or are they really just an IT shop that focuses on the line of business with nothing in between? That's a maturity model right there, right? So it depends on the maturity.
Starting point is 00:15:46 But to your question, all the things that we talked about so far up until the end of this conversation today, all the things we talked about are areas that organizations are touching, regardless of their maturity. They may be doing it better or worse, but they are touching it. And what AI does is allows for it to be faster and more reliable to get it done. One of the things that I believe that there's a challenge that I hear quite a bit from the organizations, not just the vendor side, but the organizations that are running it,
Starting point is 00:16:20 is there's a lack of confidence in the solutions that are in the market. So they want to have stronger confidence from vendors that are producing these solutions. The other side of it is, as I mentioned, CIO's focus is application modernization, right? Need to modernize. Whether it's infrastructure or application, they need to modernize. When you put in a solution that says, hey, we've got this AI solution that's going to help you do that. They look at it and go, this is another resource I have to put behind this. So how complex is this? How much work do I have to put in?
Starting point is 00:16:51 Do I have to hire another body? Do I have to look to a service delivery partner to do this? You know, there's a whole number of factors here. So my advice to vendors in this space is to provide the big, the easy button, right? The big green button that says go, right? If you can give that frictionless deployment of how to get DevOps teams more productive
Starting point is 00:17:10 and increase that velocity, even though developers really don't like to use the word velocity because they're doing as much as they can. DevOps teams do want to get the code out the door faster because that's their business KPI. And if you can have that easy button to do so, that will help accelerate those goals. Yeah, I definitely am seeing that as well. It seems like the best positioned vendors, and frankly, I have to say, some of the product
Starting point is 00:17:35 vendors seem to be kind of left behind because they're not approaching it this way. But the best positioned ones are the ones that are leaning into integration points with other products and other frameworks beyond the scope of their product. And even folks that are leaning into compatibility across the same segment and ones that are very, very partner driven. If you look at the companies that are most successful in modern application deployment and development. It's the ones that are very, very friendly, open, and partner-driven. And frankly, that's a good way to be anyway. Do you agree with that? I do, because when you look at, we talked about skill gap as a major challenge, right?
Starting point is 00:18:18 When you look at skill gap as an issue, Stephen, a lot of times organizations will say, well, how can I accelerate this? I need to hire more people. Well, even if they can find people to do the job, it takes about nine to 12 months to get that team productive to deliver. So that's, that's pretty much outside the window of what they're looking for. But that partner ecosystem that you mentioned, that's where the service delivery partners come in and can accelerate those, those business objectives and goals quickly. That's when you start working with digital services partners or service delivery partners to get the goal done. One thing I will say is, you know, with the,
Starting point is 00:18:50 we saw a big acquisition that occurred in the virtualization space recently. There was a reduction in the amount of products that they offered. One of the things I like about that approach is if you give a bag of bits to organizations, you've increased the complexity, which means that you're not going to accelerate their time to value. You're not going to accelerate their time to deliver these products. So even if you're working with a partner ecosystem, you have to put it together. Reducing the amount of bag of bits or amount of pieces to make it work so it's a frictionless deployment, even for those service delivery partners, is an advantage for a lot of vendors. I'm so interested to see how this rolls out.
Starting point is 00:19:34 And Stephen, I'm so glad that you brought up open because it really does beg the question, how is open source and open data? I mean, open data is the new topic that enters into the fray with this. How will they reshape industry? And how are organizations going to lean towards open source to take advantage of shared capabilities? The announcement that came at AI Field Day 4 around Gemma is just an example of a large leading AI company taking what DeepMind has created and providing it out in the open source community. Do you see a lot of action
Starting point is 00:20:15 in that space, Paul? And how are broader organizations responding to things like Gemma? Yeah, well, I'll answer it in kind of twofold. One, the open source community is, I should say this, organizations in our survey that responded to our recent survey indicate that they want to work with vendors that sponsor open source projects. But 68% of those respondents want to work with vendors that they want to use open source, but they want commercial level of support to go with it. They don't want to go at it alone. But the open source piece is incredibly important because it's hardened and tested and it's in the
Starting point is 00:20:56 market, you know, it's tested by the entire community. But to Steven's earlier point, it's also very much part of that partner ecosystem. There's that integration across the entire environment. So that piece is incredibly important. With regards to Gemma and the approach there, I certainly see that as the step in the right direction. I think that that's where the organizations are going, especially when you start adding in development efforts and building code and then, you know, building that, putting those practices in place of ethical AI and such. Those all kind of play into that whole ecosystem. Yeah, I think that the GEMMA announcement by Google, as mentioned, I mean, the importance of
Starting point is 00:21:40 that, well, number one was that it brings, you know, good models to bear for people who are developing AI applications. But more importantly to me is that they're releasing it in a way that's ready to use. So you've got Kaggle notebooks. It works with, you know, Hugging Face and it works with, you know, NVIDIA GPUs and the kind of things that developers are needing to integrate with it. It's just all there. And I really like the fact that Google announced that. And I'm really excited to see so much other open access out there for these models. Because ultimately, I don't think that application developers are going to develop their own LLMs from scratch. I mean, do you see that happening, Paul? No, actually, I don't think that application developers are going to develop their own LLMs from scratch. I mean, do you see that happening, Paul? No, actually, I don't. But I do want to touch a little bit more on your comment there.
Starting point is 00:22:32 One of the things around, you know, providing that frictionless approach, providing those templates, so to speak, is important. But not to get too much into the nuts and bolts and weeds. When you start looking at this, the path, like if you're dipping your toe in trying to get going into these AI solutions and such, do you use a CPU or a GPU? Like what size do you use to kind of get going, right? That's an important kind of decision. Now a developer's not gonna to know that, right? Or think about that too much.
Starting point is 00:22:59 They're gonna think about it in the context of saying, what do I need to kind of get myself going, right? But the underlying pieces, well, what if the infrastructure only has a large number of CPUs? And can those CPUs be used for AI initiatives? Sure, they can, right? Are they the best? Probably not. Depends on what level of you're growing.
Starting point is 00:23:17 But that's an architectural decision that's made. And it's also transparent to the developer, which is important because the developer wants that invisible infrastructure. They don't want to know what's happening underneath. They just want to know they have the resources to go. To your question, Stephen, on LLMs, no. Developers have a lot of constraints or a lot of pressures on them, especially when you're looking at refactoring or building that new code. If you're building new, you're trying to say refactor, if you're refactoring, you have a lot
Starting point is 00:23:52 of work to just get the business logic right, right? So if you're trying to build the business logic and whatever database in the back end that's there, data sets that are back there, the API calls to connect into it are important, right? Which leads to a whole different conversation about use cases and calls and costs, especially if you're going from cloud to on-prem, et cetera. There's a whole bunch. That's a whole different conversation probably for another hour-long discussion. But for the point of this discussion, when we look at developers using and building, I don't think they're building their own LLMs to do this, because they're using their own data sets. They may have some test labs and some test cases for that. Sure, that makes sense. But again, for organizations listening to this discussion today,
Starting point is 00:24:38 they really need to make sure that the organization has compliance, governance, and rules in place so their proprietary information does not get out into the public domain. Acceleration seems to be a common theme. And one of the things that I think about in this is watching how quickly the large AI players are moving. I mean, the speed of innovation is something after 25 years in the industry I have never seen before. One question that I sit with is, are enterprises going to be able to keep up with this pace? And do you see enterprise perspectives changing to actually start implementing technology at a faster rate?
Starting point is 00:25:21 Or do you see them staying with, you know, what tends to be more of cautious perspectives in terms of the implementation of technology? Allison, great question. What we see in our research is organizations are tasked, specifically development organizations, are tasked to do two to three times more work in application development today with half the staff they had just a few years ago. So to your comment, the acceleration of rapid code development is just ramping up and it's like a hockey stick. And without the automation tools, you know, like an Ansible or Terraform or something that kind of allows you to do automation, organizations that are not using automation are going to fall behind.
Starting point is 00:26:09 And the challenge that I see, Allison, is it's not so much about it, not necessarily just about a competitive advantage. It's also about security impacts, things that happen to applications that happen really rapidly. If organizations cannot adjust and make changes that quickly, that's an impact to the business's integrity, right? So now you have either it's bug issues or you have security breaches or you have just maybe downtime, whatever it may be, but you have the customer impact and that's what's really what's going to happen. So it seems like 2024 is shaping up to be the year of AI integration into the enterprise. And I can't wait to see the use cases that come out of that. Because I think it's just the amount of opportunity for innovation across lines of business, across industries. I think we're going to be breathless
Starting point is 00:27:05 in what we see. I want to know if 2024 is actually the year, or do you think it's going to take a few years for this all to materialize, Paul? Yeah, Alison, great point. I see it as this way, just to kind of wrap up what we've been talking about here 2024 ai is in its infancy right especially when you look at um the impacts of organizations ai is not new right we've seen the organization's been around for for many many years the actual practical use cases of ai have been accelerating at the end of 23 going into 24 and it's taken off like a rocket ship and this is where you know i you know I don't think you can, you can, I don't know. You, you,
Starting point is 00:27:50 I think that the key is in the ignition and it's going right. I don't think you can't, you can't, you know, shut that engine off. It's going. So I think that we're just in the early stages right now, especially with rapid code development with testing, with the CICD pipeline, with developers. But just overall, as we talked about in this discussion today, the impact and release of what organizations are trying to do with application development is incredibly fast, and it's only going to get faster. So I just see AI just taking off and helping with those solutions.
Starting point is 00:28:23 But it's an enabler. It's not the end goal. Yeah, we've already seen so many new initiatives being announced. You know, at GTC, everywhere we go, you know, AI was obviously a big focus at Mobile World Congress earlier this year as well. You know, we certainly expect to see a lot about it at Cisco Live and HP Discover and all these other events. It just seems like there's no stuffing the genie back in the bottle. And not only that, but these things are going to be, you know, they're going to be out there no matter what. You know, another aspect without bringing up too many new topics at the end here.
Starting point is 00:29:00 Another thing that occurs to me is, you know is the emergence of the AI PC. I think that Apple and Microsoft are going to be integrating much more support for AI on the desktop, on the workstation. And that is inevitably going to pull application developers into using this technology simply because it's right there. It's at hand. It's in the operating system. It's in their favorite development platform. They might as well use it. And to me, it's an exciting time to watch. Certainly, I don't think that this is going to slow down anytime soon. In fact, I would say, Paul, I think that when you rerun your survey, maybe next quarter, you're going to see very different numbers. And I can't wait to read that report from you. Absolutely, Stephen. I would agree absolutely with that statement. Well, thank you so much for joining us, Paul. It's been great
Starting point is 00:29:51 to catch up with you. I am privileged to be able to talk to Paul all the time. And as I said, we'll be seeing you at the end of May at AppDev Field Day. Tell us a little bit more about where we can continue the conversation. Oh, absolutely. So there's a lot happening with this discussion. We have a lot of research briefs and research notes that are out on Futurum.com. But you can find us both on social in our LinkedIn and X, as well as from our platform. And we'd love to continue the conversation on those social venues, as well as you can reach out to me at thefuturumgroup.com. Allison, how about yourself? I'm Allison Klein with the Tech Arena. You can see me publishing about AI and all sorts of topics
Starting point is 00:30:35 from cloud to edge on thetecharena.net and connect with me at Allison Klein on LinkedIn. Absolutely. And for me, you'll find me here on Utilizing Tech. We're actually getting a new season ramped up. We've already even recorded some of the episodes, amazingly enough. And so we'll be launching a seventh Just go to techfielday.com to learn more about that. And of course, you can catch our Tuesday podcast and our Wednesday news show to keep up to date on everything that's going on. Thank you for listening to Utilizing AI, part of the Utilizing Tech podcast series. You'll find this podcast in all of your favorite podcast applications, as well as on YouTube. If you enjoyed the discussion, please do give us a rating and a nice review. We would love to hear from you as well. The podcast is brought to you Thank you. or find us on X Twitter and Mastodon at Utilizing Tech. Thanks for listening, and we will catch you next week.
