Utilizing Tech - Season 7: AI Data Infrastructure Presented by Solidigm - 1x1: AI is Getting Real with @AndyThurai

Episode Date: August 28, 2020

In this pilot episode of Utilizing AI, Stephen Foskett and Andy Thurai discuss the reason for the podcast and consider where we go from here. AI is getting real, moving out of academia and hyperscale and into the enterprise. Businesses are adopting AI strategically, and IT companies are deploying AI technologies in their products. This trend is quite obvious to Stephen at Gestalt IT, as many Tech Field Day companies present AI-enabled products for network monitoring and management, security, mobility, and much more. Today's infrastructure applications are focused on applying machine learning to large datasets, finding needles in haystacks, but tomorrow will see much more exciting applications.

This episode features:

Stephen Foskett, publisher of Gestalt IT and organizer of Tech Field Day. Find Stephen's writing at GestaltIT.com and on Twitter at @SFoskett

Andy Thurai, technology influencer and thought leader. Find Andy's content at theFieldCTO.com and on Twitter at @AndyThurai

Date: 08/28/2020 Tags: @SFoskett, @AndyThurai

Transcript
Starting point is 00:00:00 Welcome to the Utilizing AI podcast. I'm your host, Stephen Foskett, and we are here learning about AI and the enterprise. I think you'll find that this podcast is a little different from some of the others, because we're not so focused on the academic aspects. We're not going to talk about blah, blah, blah. We're going to talk about how AI gets real. So before we start, let me introduce my co-host here today. Andy, tell us a little about yourself. Sure, thanks, Stephen. I am Andy Thurai, @AndyThurai, and funny enough, as someone pointed out, I have AI in my last name, thanks to my parents. I'm the principal and the founder of theFieldCTO.com, that's theFieldCTO.com. I'm an emerging tech strategist, advisor,
Starting point is 00:00:53 and practitioner of emerging tech, particularly AI, ML, edge, IoT, and cloud technologies. Glad to be on the show. Thanks, Andy. And I'm Stephen Foskett. I am the founder of Gestalt IT and organizer of Tech Field Day. And you will find me online at @SFoskett pretty much everywhere. But for the purposes of this, you'll of course find us at utilizing-ai.com and on Twitter as utilizing underscore AI. So let's get the conversation started here and just talk about, I guess, the different ways that AI affects us. I think most people are quite familiar with, I guess, your Siri or your Google Assistant or Alexa or whoever. That's not what we're talking about here.
Starting point is 00:01:46 We're talking about AI in business and in the enterprise and in the data center and the cloud and so on. Basically, having this stuff serve business goals rather than consumer convenience. We're also not talking about what a lot of other AI podcasts talk about, which is basically academic stuff, things that you'd write papers and make posters about, you know, models and, you know, doing calculations and so on. Okay, that was pretty funny. My Siri actually just introduced itself and said hello, which I think is honestly maybe the perfect thing to have happen in an AI podcast. I'm still thinking. Because you keep saying Siri. You're invoking it, right?
Starting point is 00:02:31 That's funny. So Andy, you and I were talking about this, the whole world of AI and how it works and how it can come to the enterprise and what it can do. So talk to me about that. How does AI get real in the enterprise? So a couple of quick points. The first one I wanna point out,
Starting point is 00:02:52 even though you said Siri and Google and others, I don't have Siri, so it won't get involved here, but what we are talking about there is what we call B2C, business to consumer, trying to enable the consumer. Alexa is another example that will try to do something for the consumer, trying to solve a problem so that rather than me going to a website, logging in, and doing the search and all that, it makes life easier for the consumer, right? So that's the B2C angle of it. But on that particular topic, if I stay on it for a minute, the concept is called conversational AI,
Starting point is 00:03:32 and that's more than the Alexas and Siris or Googles of the world. There are some classic examples that we have worked on in the past. I'll give you one example in a B2B scenario. You remember the good old days when we would call people up on the phone and wait for customer service for 30 minutes, right? So one of the issues that came up with that is that customer service is overwhelmed trying to solve whatever problem you're having. It could be as minor as what hours you're open or what your address is, whatever that may be.
Starting point is 00:04:08 So by introducing this concept of conversational AI, which could be a combination of a chatbot or even a voice-automated system talking to you, or the whole nine yards, it could help you in solving whatever problem you have. Some companies have obviously taken it to the next level. For example, Uber and Lyft: if you're trying to reach them, you won't even find a phone number
Starting point is 00:04:32 to call their customer service. It's all about dealing with them using chatbots or emails, in combination. That's all AI based, right? So conversational AI, what I'm trying to say, is a lot bigger than just your Siri and Google. There are a lot of ways it can help; that's one. And the second one is, this is a classic problem with some of the financial institutions particularly, or regulated industries. When they receive hundreds of thousands of calls, they are expected, or supposed, to audit the calls that they receive.
Starting point is 00:05:07 Remember, when you call somebody, there's always a standard message saying that this call is being recorded. We will audit that, or we might audit that. And the reality of that is they cannot put enough manpower on listening to all the calls. That's almost impossible. And when the budget comes in, that's the first thing they cut. So what they could do, instead of putting in manual effort to listen to all of that and see whether everything is in order, they could always have AI listen to those calls. And the problematic ones, the one percent, two percent
Starting point is 00:05:41 of them, the ones that need to be addressed, or that are out of compliance, or where somebody promised something they are not supposed to, could be flagged, and then a human can go and solve the problem. But overall, what I'm trying to say is there's a difference between research AI, in which you are trying to do some kind of research and trying to make things better for the social good or whatnot. For example, COVID research is one thing. There may not even be a monetary benefit, but the overall societal and cultural benefits that will come out of that
Starting point is 00:06:14 might be something for the greater good. But that doesn't mean that everybody should be doing that. There's a time and place for that, and there are companies who are doing that. And then there are enterprise companies purely trying to solve enterprise problems, right? There's a laundry list of things that you could do using AI. And granted, AI is not at the level yet where it could solve customer problems directly.
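To make the call-auditing idea above a little more concrete, here is a minimal, hypothetical sketch of that kind of machine-assisted review: transcribed calls are scored against a handful of compliance red flags, and only the riskiest one or two percent are routed to a human reviewer. The phrases, weights, and data shapes are invented placeholders, not anything described in the episode.

```python
# Rough sketch: surface the small fraction of recorded calls that need human review.
# The red-flag phrases, weights, and call data below are illustrative placeholders.

RED_FLAGS = {
    "guaranteed return": 3.0,      # a promise an agent is not supposed to make
    "off the record": 2.0,
    "waive the penalty": 1.0,
}

def score_call(transcript: str) -> float:
    """Crude risk score: weighted count of red-flag phrases in one transcript."""
    text = transcript.lower()
    return sum(weight for phrase, weight in RED_FLAGS.items() if phrase in text)

def calls_to_review(transcripts: dict, review_fraction: float = 0.02) -> list:
    """Return roughly the riskiest 2% of calls as (call_id, score), highest first."""
    scored = sorted(
        ((call_id, score_call(text)) for call_id, text in transcripts.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    cutoff = max(1, int(len(scored) * review_fraction))
    return [pair for pair in scored[:cutoff] if pair[1] > 0]

# A human only reads the flagged calls instead of listening to all of them.
flagged = calls_to_review({
    "call-001": "Thanks for calling. Our hours are nine to five on weekdays.",
    "call-002": "I can get you a guaranteed return, but keep it off the record.",
})
print(flagged)  # [('call-002', 5.0)]
```

In a real deployment the scoring step would typically be a trained classifier running on speech-to-text output rather than a keyword list, but the workflow, where the machine filters and a human decides, is the same assist-the-human pattern Andy is describing.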
Starting point is 00:06:37 There are some other things we could get into in detail, but at least it has become what I call cognitive assistance. It's assisting enterprises, or at least assisting humans, in taking a look at a decision and making it. Okay, AI suggests that you should be doing this, and yes, I agree with that. So when you have both a machine and a human make a decision, then it becomes a solid decision-making process without trusting only one or the other, because otherwise you could always question it, right? Yeah, I think that's really the key to this whole thing right there, is that essentially
Starting point is 00:07:23 the way that AI is getting real is due to a number of factors. I mean, certainly there's the practical factor that we figured out how to do machine learning and deep learning and, you know, things like that. I mean, that works. There's also the fact that I think that we've stopped looking at it sort of in a science fiction way to say this thing is going to be, you know, Skynet and it's going to come kill us all. I guess we still look at that in science fiction. But in enterprise, I think what we're looking at is exactly what you said, which is basically humans have limited bandwidth, limited pattern matching capabilities,
Starting point is 00:08:00 limited time, and AI can help us do what we do better. And to me, that's like the fundamental thing here. So if we're talking about, you know, MLOps or AIOps, whatever you want to call it, we're really talking about something that helps us do what we do more efficiently and do it better, right? Exactly. I mean, that's the key, right? Do what you do in a more efficient manner. For example, you mentioned the fact of human limitations.
Starting point is 00:08:34 The human brain power is also a limitation. The problem is even when humans collaborate, it's a collection of individual brains that work together, but then it takes time to develop things. But when you put a machine on it, especially when you have a situation with a ton of data, petabytes of data, or terabytes of data even, when you put that in front of a computer, the computer takes a lot less time
Starting point is 00:09:01 to either look at the patterns or get insights or analyze or even build models based on that. Can that be done using human brains? Sure. I mean, there are companies that still, even today, do the models using the good old tools, the BI tools, and some companies actually still use spreadsheets, believe it or not. And that still works fine. So it all depends on whether or not it's becoming a problem for you.
Starting point is 00:09:26 If it's beyond the capacity of the folks who are trying to do it now to solve, then you might have to start thinking about some kind of assisting technologies. AI could be one. There are other, finer insight technologies as well. Everybody is an AI vendor nowadays, of course. But you've got to find the right thing to solve your problem.
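One concrete flavor of the assisting technology Andy mentions is anomaly detection over data that is far too big to eyeball. Below is a minimal sketch using scikit-learn's IsolationForest as one possible tool (not one named in the episode); the synthetic numbers stand in for whatever metrics or transactions a team actually has.

```python
# Sketch: let a model surface the handful of odd records a human should look at,
# instead of someone paging through the whole dataset in a spreadsheet.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Stand-in for "normal" operational data: 10,000 records with 5 metrics each.
normal = rng.normal(loc=0.0, scale=1.0, size=(10_000, 5))
# A few injected oddballs sitting far away from the normal cluster.
oddballs = rng.normal(loc=8.0, scale=1.0, size=(10, 5))
data = np.vstack([normal, oddballs])

model = IsolationForest(contamination=0.001, random_state=0)
labels = model.fit_predict(data)            # -1 means anomaly, 1 means normal

suspects = np.where(labels == -1)[0]
print(f"{len(suspects)} records flagged for review out of {len(data):,}")
```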
Starting point is 00:09:51 That's actually another point I think that's important to make. And that's that we're calling this AI, utilizing AI. AI is bigger than machine learning. And I think that a lot of the time, especially today, people assume that ML equals AI and AI equals ML. And that's not entirely true. You know, I mean, I think that it's important to realize that many of the first successful artificial intelligence implementations were nothing more than expert systems, which had, you know, things kind of programmed into them. By far the most popular AI-based toy, and the most popular AI familiar to most people, is that little 20 questions game. Essentially, you can do really remarkable things using, you know, an expert system or a 20 questions kind of system,
Starting point is 00:10:46 that's not ML at all, but it is AI. And I think that we're gonna see increasingly that there are various forms that this technology is gonna take. I mean, already even in ML, there's various forms, right? I mean, I think that you've written a little bit about that as well, the fact that, not all ML is the same thing, right?
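As a toy illustration of the expert-system point, here is a sketch of a 20-questions style guesser in which every question and conclusion is hand-written by a person. Nothing is learned from data, so it is not machine learning, yet it behaves like the guessing game Stephen mentions. The tiny animal tree is invented for the example.

```python
# Toy expert system: a hand-coded decision tree, with no machine learning anywhere.
# Each node is either a yes/no question or a final guess.
TREE = {
    "question": "Does it live in water?",
    "yes": {
        "question": "Does it have fins?",
        "yes": {"guess": "a fish"},
        "no": {"guess": "a frog"},
    },
    "no": {
        "question": "Does it fly?",
        "yes": {"guess": "a bird"},
        "no": {"guess": "a dog"},
    },
}

def play(node=TREE):
    """Walk the hand-written tree by asking yes/no questions, then guess."""
    while "guess" not in node:
        answer = input(node["question"] + " (y/n) ").strip().lower()
        node = node["yes"] if answer.startswith("y") else node["no"]
    print("I think it's " + node["guess"])

if __name__ == "__main__":
    play()
```

Real 20-questions engines are far larger, and some do learn from players over time, but the point stands: rules written down by an expert can look intelligent without any model training involved.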
Starting point is 00:11:07 Yeah. So a couple of things on that. Talking about expert systems, or using AI as applied AI, things could be pretty simple. I'll give you an example. I believe the company is called Khan Academy. They provide free resources for kids to prepare for college and school and other things. And I have my kids, well, I basically force my kids, use that, right? Rather than wasting time on video games.
Starting point is 00:11:38 One of the things I found out, which was pretty interesting, was that it doesn't categorize you because you're a kid going into tenth grade in this geography, so you should get only this list of questions. It doesn't put you in that category. What it does is base things on your ability to interact with it, at least from what I found, and there are other systems doing the same thing as well, of course. So what it does is the system actually learns about you, right? And then it says, okay, I asked you these three sets of questions, or five or ten. Based on your ability to answer, I realize you are far more advanced or far lower
Starting point is 00:12:22 than the standard that I set up for the common public. So I either amp up your questions, so I ask you something more challenging, or I bring them down a level. I mean, this is no different than, for example, you know, my kids, they are about to go to college now, but when they were in school, you know, they got placed into this advanced placement or whatever for a certain category of people. So even in the regular schools, the teachers monitor the students based on their ability to execute, and then they place them in an advanced grouping. So you get moved to a different group to learn things that would challenge you. Otherwise, they'll get bored.
Starting point is 00:13:02 So you could use expert systems to do a similar thing. Don't ask the exact same questions or train people in the same way; use a different set of questions. But if you do that, then it also becomes an issue of profiling as well. So if you test 10 people, and you give eight people the same questions, give one of them very advanced questions, and give one of them questions at a lower level than normal, how do you justify that? That becomes an issue.
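To make that adaptive idea concrete, here is a toy sketch of the kind of logic being described, and emphatically not Khan Academy's or any real product's algorithm: every learner starts at the same level, and the next question level is chosen from their recent answers. It also shows exactly where the profiling concern comes from, since two learners end up seeing different questions.

```python
# Toy adaptive-difficulty selector: pick the next question level from recent answers.
# Purely illustrative; no real product's algorithm is being reproduced here.
LEVELS = ["remedial", "standard", "advanced", "challenge"]

def next_level(current: str, recent_correct: list) -> str:
    """Move up after a streak of correct answers, down after a streak of misses."""
    idx = LEVELS.index(current)
    window = recent_correct[-3:]                 # only the last few answers matter
    if len(window) == 3 and all(window):
        idx = min(idx + 1, len(LEVELS) - 1)      # amp up the questions
    elif len(window) == 3 and not any(window):
        idx = max(idx - 1, 0)                    # ease off
    return LEVELS[idx]

# Everyone starts at "standard"; their answer history drives where they end up.
print(next_level("standard", [True, True, True]))     # -> advanced
print(next_level("standard", [False, False, False]))  # -> remedial
print(next_level("standard", [True, False, True]))    # -> standard
```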
Starting point is 00:13:33 So that's where the article that you and I were talking about comes in: all the things about transparency, ethical AI, security, privacy, all of those come into the picture as well, whether or not you're allowed to profile an individual to do that. And just one more quick thought to finish off on that. AI is more than ML, I agree. It's not only expert systems, but also this whole notion we just talked about, conversational AI, natural language processing, speech recognition, text-to-speech conversion, all of those. They don't necessarily fall under the ML category, but they are actually AI, right? So there are broader categories: vision systems, speech systems, you know, and machine learning to find insight from data,
Starting point is 00:14:27 those kind of different categories. But they all, for whatever reason, got grouped under AI. Yeah. And I think that, thank you, I think that this is all basically things that we're going to be talking about in our future episodes. I mean, I think that everything we've mentioned here is something that we're going to be doing an episode on in the future. One more thing, actually, that I want to bring up too, and this goes to my background, is that AI also has implications for operations and infrastructure.
Starting point is 00:15:00 In other words, what we found is that for the most part, you need specialized hardware, maybe specialized architecture, you know, systems architecture in order to make this stuff work. And for that reason, AI actually is coupled with advances in cloud computing, in server computing and, you know, functions as a service. Also, most of us are aware of the implications of using GPUs instead of CPUs for AI processing. Even infrastructure choices. One of the things that I've been really interested in as more of an infrastructure guy is to see how systems architecture has changed and hyperconvergence and, you know, building scalable storage systems and high performance storage
Starting point is 00:15:51 systems, building, you know, shareable, you know, GPU infrastructure, all that kind of stuff has come into this whole conversation. And I think that that's another area that is important for our audience to think about: not just what is and isn't AI and what can you do with AI, but sort of how can you do AI? Like, what makes sense? And so, you know, you can have, you know, software as a service. You could have functions as a service. You could have, you know, a private or a hybrid cloud infrastructure. You could have a hyper-converged infrastructure. You can have, frankly, AI accelerators. I mean, one of the things that we're seeing is CPUs with AI-specific instructions built into them. You know, we saw Intel announce that, Apple has that, you know, and then there's the world of, you know, all these, you know, GPU companies. Of course,
Starting point is 00:16:42 NVIDIA is famous for it, but a lot of other GPU technology is being used for AI processing and ML processing specifically. All of these things, I think, go into the conversation as well. So if we're going to talk about how utilize means to make use of, how are we going to make use of technology? It's not just how do we apply it to the business, but it's also how does it work? Like, what do we need to do it? And so from my perspective, that's what I'd like to bring to the podcast. So, you know, how I see it is, you know, basically me, you know, kind of coming in from that perspective, like, how do we do this thing? You know, what do we need to implement? What do we need to buy? What makes it practical? And, you know, folks like you basically bringing in the higher level discussion of, you know, sort of what is it? What can it do? How can it help improve the business?
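As one small illustration of the hardware point, here is roughly what targeting an accelerator looks like from the software side, using PyTorch as one example framework. The matrix size and the timing approach are arbitrary choices for the sketch, and actual speedups depend entirely on the hardware in question.

```python
# Sketch: run the same matrix multiply on whatever accelerator is available.
import time
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

start = time.perf_counter()
c = a @ b
if device.type == "cuda":
    torch.cuda.synchronize()   # GPU kernels launch asynchronously; wait before timing
elapsed = time.perf_counter() - start

print(f"4096 x 4096 matmul took {elapsed * 1000:.1f} ms on {device}")
```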
Starting point is 00:17:35 Right. And I'm really looking forward to that. Talking about the components, you know, what's used to build the AI systems, some of that, as you said, is the GPU versus CPU and highly scalable infrastructure and whatnot, and there's a way to put all these things together that could cater to that. When it comes to AI, there are two parts to it. One is, you know, what you do to create what they call the models. So the model creation requires really high-powered computing, right? And then there's model inference. After it's created, the model will go to wherever you deploy it, mostly to the edges. And then it'll look at it, saying, you know what, you asked me
Starting point is 00:18:19 to find out when this happens and this happens, that kind of inference kind of thing, right? So the inference hardware, or the performance of the computing, doesn't have to be that big. It could pretty much work anywhere if you architect it properly or design it properly. But for the model creation you need very high-powered computing, so there's a field called HPC, high-performance computing, that has become quite popular. There are about five to ten companies that are doing it.
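A rough, hypothetical sketch of the two-part split Andy describes: creating the model is the expensive step that wants big compute, while applying the finished model (inference) is comparatively cheap and can run on far more modest hardware. The data and the model below are stand-ins, with scikit-learn used only as a convenient example.

```python
# Sketch of the split: heavyweight model creation versus lightweight inference.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# --- Model creation: the part that wants HPC-class compute and lots of data ---
history = rng.normal(size=(50_000, 20))                    # stand-in training data
outcomes = (history[:, 0] + history[:, 5] > 1.0).astype(int)
model = RandomForestClassifier(n_estimators=100, n_jobs=-1).fit(history, outcomes)

# --- Model inference: the part that can run almost anywhere once the model exists
def score_one(record: np.ndarray) -> float:
    """Probability of the 'interesting' class for a single new record."""
    return float(model.predict_proba(record.reshape(1, -1))[0, 1])

new_record = rng.normal(size=20)
print(f"Score for one new record: {score_one(new_record):.3f}")
```

In practice the trained model would be serialized and shipped to wherever the inference needs to happen, at the edge or in a nightly batch job, which is why the two halves can sit on very different hardware.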
Starting point is 00:18:49 Some of them are better than the others. I don't want to name names, but if you look them up, they become the basis for this high-powered model creation. And having said that, the HPC doesn't have to be done all in the cloud, even though a lot of the implementations are offered only in public cloud or hosted environments. It could also be built in your private data center. I'll give you an example. There's this big, huge bank, who shall remain unnamed. They do what they call a risk analysis of all of their business accounts almost every day, every single day.
Starting point is 00:19:33 And the amount of data they get to analyze the risk may be somewhat limited. At times it could be maybe megabytes or a smaller amount, a few hundred megabytes per account. But the number of accounts is not small. We're talking about millions of accounts. And when you run risk analysis and recategorize them every night so they can make decisions on them, that's a ton of workload that they couldn't handle using their existing infrastructure. It starts to run at the end of the business day, from five or six o'clock, and runs through to almost six or seven o'clock the next morning. That's how long it takes, and if anything goes wrong in that run, that day is shot. So instead of continuing to do that, because, being a bank, they are a little bit skeptical
Starting point is 00:20:22 about moving things to the cloud. They built a private high-performance computing environment, an HPC environment, using some of these components, and that overnight run of almost 11 or 12 hours, they brought it down to under an hour to do the exact same thing, right? So it doesn't have to be in public cloud. Not everything has to be in public cloud. If for various reasons you want to do a private instance in a public cloud, your own instance, or build it in your private data centers, sure, it might be a little more expensive. But if that's what you absolutely want, that's a possibility that's available too. I'm not saying that you and I will go and build it for them, but I'm saying they should consider those options. And I really appreciate that, Andy, because that's actually another thing that I'd really like to get into on the podcast as well, some of these more personal stories and, you know, kind of use case examples of how this technology is getting real. So how are companies using it? How can it be applied, and how can it, you know, bring a real business benefit? And I think that right there is really the goal of this whole, you know, this whole topic: you know, what is AI? What is it not? How does it work, you know, productively? What can it do for the business? How do we do it? How do we deploy it? How do we use it? And what are the
Starting point is 00:21:45 benefits that companies are going to be getting out of it? And I think all of these go to this whole concept of utilizing AI. And I really hope that folks will be joining us for future episodes and seeing how it is that we're going to utilize this technology. How are we going to make AI real? So thank you very much for listening. Thank you, Andy, for joining me for this great conversation this morning. Those of you listening, please do subscribe and rate and review the show. You can find it on your favorite podcast stations as Utilizing AI. You can also find us online at utilizing-ai.com and at utilizing underscore AI on Twitter, because Twitter doesn't do dashes. And, you know, you can contact us as well, if you'd like. Just send an email to host at utilizing-ai.com, and we would love to hear
Starting point is 00:22:38 from you. Please do share this podcast and subscribe, and we look forward to having you join us for future episodes.
