Utilizing Tech - Season 7: AI Data Infrastructure Presented by Solidigm - 2x04: Talking To Business People About AI with Ken Grohe of Weka

Episode Date: January 26, 2021

Ken Grohe of Weka discusses various business use cases for AI-enabled applications with Chris Grundemann and Stephen Foskett. AI is coming into practical use right now in applications like autonomous vehicles, drug development and healthcare, and retail. High-performance scalable storage is necessary for many ML training applications, and can be key to advanced applications in life sciences and other fields with massive data sets. The Chief Data Officer, and data scientists in general, are the future of the business, and AI is enabling the growth of this field.

Guests and Hosts: Ken Grohe, President and Chief Revenue Officer of Weka. Find Ken on LinkedIn and Twitter as @LeverageGTM and learn more about Weka at Weka.IO. Chris Grundemann, a Gigaom Analyst and VP of Client Success at Myriad360. Connect with Chris at ChrisGrundemann.com and on Twitter at @ChrisGrundemann. Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen's writing at GestaltIT.com and on Twitter at @SFoskett.

Date: 1/26/2021 Tags: @SFoskett, @ChrisGrundemann, @LeverageGTM, @WekaIO

Transcript
Starting point is 00:00:00 Welcome to Utilizing AI, the podcast about enterprise applications for machine learning, deep learning, and other artificial intelligence topics. Each episode brings in enterprise infrastructure experts to discuss applications of AI in today's data center. Today, we're discussing practical applications of AI across the business. Now, let's meet our guest, Ken Grohe. Hey, Stephen. Ken Grohe here. I'm President and Chief Revenue Officer for a company called Weka, right in the throes of all AI. We supply to a lot of the Fortune 50 that are using it for practical applications for AI and love to
Starting point is 00:00:36 go through those use cases. And there's a small connection for both of us. I spent, I think it was almost 20 years in Ohio, loved my time there between Hudson and Aurora, and now I'm over in Silicon Valley. And I can't wait to share some of those use cases and discussion all the way through it about AI because it's the coming trend. Hi, I'm Chris Grundemann, co-host on Utilizing AI. You can find me online at ChrisGrundemann. And I'm Stephen Foskett, organizer of Tech Field Day
Starting point is 00:00:58 and publisher of Gestalt IT right here in lovely Hudson, Ohio, as you heard Ken mention. You can find me on Twitter at SFoskett. So, Ken, we've been talking on Utilizing AI for the last six months or so about all sorts of different aspects of it, from enterprise applications that contain AI to enterprise AI that enables the business
Starting point is 00:01:19 and AI supporting infrastructure. Weka, as I know from my experience with Storage Field Day and my background in storage, is building really an ultra high performance platform for applications like this. And I think that this has given you some unique exposure across the spectrum of AI applications. So maybe you can tell us just to start off with, what are the key areas where AI is coming to the business right now? Yeah, I won't say we supply to all those different ones, but we're spending a lot of time with autonomous vehicles, specifically whether it be passenger vehicles, whether it be flights, believe it or not, there's actually transcontinental flights that are autonomous or semi-autonomous as part of it, helicopters and autonomous fleets. I know for the supply chain, you know, unfortunately, a lot of us are going to buy, I don't know, a suit jacket during COVID times. Everything seems to be a 42 regular. Well, I'm not a 42 regular. How do
Starting point is 00:02:22 you get all the way through that? So transportation, autonomous vehicles are a big mover. Retail, and we'll talk through some of those examples. I think Stephen, you and I were talking about one before, talk about that as it pertains to AI, but a lot of trends out there. A quick witty line that I'll steal from somebody else: if people aren't doing an AI initiative now, you're typically going to lose to someone who has one in development or actually implemented in real life. So hopefully I'm able to give some tidbits that help through it, but thanks for the question. Yeah. So jumping in there, I mean, obviously I think when people think of AI, the autonomous vehicles, that's one of the things that's a common use case that kind of really pops off. But the retail seems a little bit more nuanced, or not as often talked about. Well, we're in California now. And full disclosure, I grew up on the East Coast. I lived, I think it was from 1992, in Hudson. Loved it, up north part of Middleton Road. So you probably know where it is.
Starting point is 00:03:33 But 92 to 99 in Hudson. And then built our dream house, what I thought was our dream house in Aurora from 99 to, I think we moved down in 2017. I remember there was a grocery store. I think it was called Acme at the time. I think it still exists. It was like the center of town. It was such the center of town. I actually sold my old Mustang. I bought my old Mustang there and bought an old Volkswagen bug. There was like the town
Starting point is 00:03:51 center almost. But what we enjoy here in Santa Cruz is that everything's organic. And yes, you can make fun of the hummus and all the kale that they tend to sell. When I go to the local Whole Foods, it's actually acting as a pilot for an Amazon Go. I think Stephen, off air before, told me it was either Seattle or maybe one of the East Coast locations. He's actually been to an Amazon Go. But in the retail world, imagine a world where, I don't know why this upsets me so much, but when I'm waiting in line to buy six items at a Whole Foods for $80, yes, that's about what the typical buy is at a Whole Foods, but they're all organic. You know, when I go to do that, not only do I not wait in line anymore, I literally, as long as I
Starting point is 00:04:30 have an iPhone, sorry for those who are using non-iPhone categories, to my understanding, they don't have an Android app right now for that. But as long as you have an iPhone, you walk into the store, and as long as you're active on the Whole Foods app, I literally don't wait in line. I literally walk all the way through. So I gain that time. I tell you, waiting in line behind someone counting pennies: the bill is $30.11, and that extra 17 seconds they take to count those pennies, I literally think I could have cured cancer during that period of time. I probably couldn't, but it feels like I could. So to answer your question, the walking in and walking out of what I would call an Amazon Go or a Whole Foods, I think is spot on for an everyday use case that makes sense on the retail environment.
Starting point is 00:05:07 And then we could talk about others. But for me, that's real world. And really, that's a block away from where I am right now. Chris, great question. Yeah, that's interesting. I'm really curious as to what part AI plays in that transaction. Right. So I understand where you walk into the store, you've got your phone in your pocket, you pick up your goods, you just walk out.
Starting point is 00:05:45 And obviously, there's got to be some RFID technology, there's some tracking, maybe Bluetooth or Wi-Fi from your cell phone. But what's the actual AI piece? It actually tracks it through its RFID technology. So that actually has the tracking as part of it. So you might say that's a little bit overkill. Why would you put a GPU in that environment? It's based on the size of the store and the actual traffic availability and the overall growth of the store that you do that. And they're betting for a long-term hold. And if you look at the business of Amazon, they invested huge to get our buying cycles in play. So you might say they might put a, sorry, I bought an A100 recently, and I think it was $130,000, even at wholesale value. So you say, why would you put $130,000, not even a cluster, but a small GPU, in a store like that? It's worth it for them to have the competitive advantage overall. So that's the
Starting point is 00:06:21 technology. For full disclosure, I don't work for NVIDIA, and I don't want to give away anything NDA from them, but I know for a fact they use GPU technology to make that happen. So in both retail and drug development and autonomous vehicles, especially, it seems that what's happening here is we're using AI specifically to integrate a variety of sensors, right? So you've got, you know, cameras, you've got all sorts of other, you know, RFID sensors and Wi-Fi and Bluetooth tracking and things like that. You know, this has been a challenge, I think, for technology to figure out how do we take specifically video data, how do we take video camera data and turn that into something usable? And I know that that's, well, I think that that's like the core challenge of autonomous driving, especially for Tesla, who is 100% focused on camera data instead of other data types.
Starting point is 00:07:18 And I know that that's what's being used in retail as well. I mean, they have cameras trained on everybody and it's not like a person's watching it. I think Ken, what you're saying is that it's like the AI model is watching that using this hardware. It is the AI model. I don't want to get too much into it, but yeah, we do supply to the autonomous vehicle community, especially the first movers. I think we talked about this before, but whether you're doing passenger vehicles or whether you're doing trucking, there's a lot of supply chain issues out there. And there's autonomous vehicles. I think I can say this name, TuSimple is one of the ones that was a first mover for that. And there's two major air carriers, one of which has that already in production, the one in Europe as part of it.
Starting point is 00:07:57 But specifically, what happens is the modeling and the training, what a parallel file system, and we got to kind of talk about this as well, was one of the adjuncts of AI. I think some of your other guests will probably talk about the power of GPUs. And I believe that a million fold. Some of your other guests will talk about the power of throughput. And we do not make networks. I'm sure other vendors can talk about networks, but we are recipients of networks.
Starting point is 00:08:21 And we're seeing 400 gigabit networks implemented today, not just in academia, like in real world environments. I liken it to, and I'm glad this is video, I liken it to an equilateral triangle. My geometry teacher would be so proud of me right now. I feel you need to have all sides, all angles equal. So 60, 60, 60, same length triangle. You need the GPUs or a cluster. You need the actual bandwidth ability, as far as you can't do a 10 gig network necessarily to run AI, in my opinion, or at least not to yield the ROI that you're looking for.
Starting point is 00:08:49 And I think you do need a parallel file system. What that's allowed you to do, and a quick example that you would say on the modeling, specifically in autonomous vehicles, we have seen studies where what used to take around two weeks for a training algorithm can get done in around four hours. That's the type of thing that's out there. I forget the exact amount of capacity that's thrown off by an individual Tesla vehicle. I've heard anywhere up to a terabyte per day across all the different sensors. But to answer your question, it's not just throughput, it's the ability to parallelize the throughput so you can have multiple work paths, if that helps. Yeah, and I think that that actually leads directly into this question. So, you know, people might say, oh, well, Weka, you know, I mean, you're not an AI application, but, you know, you need to have this kind of storage underlying a lot of these applications, right? I mean, but the question would be, is this more for training or is this more for production? In the autonomous vehicles, it's more for training.
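Ken's point about parallelizing throughput across "multiple work paths" can be illustrated with a small sketch. This is a generic illustration, not Weka's implementation: the shard count, worker count, and simulated I/O latency are all hypothetical stand-ins for reading training data off storage.

```python
# Sketch: overlapping many independent I/O paths vs. reading serially.
# A real parallel file system stripes data below the application, but
# the effect on ingest time is the same as overlapping reads like this.
from concurrent.futures import ThreadPoolExecutor
import time

def read_shard(shard_id: int) -> int:
    """Stand-in for reading one shard of training data (e.g. sensor logs)."""
    time.sleep(0.01)  # simulate per-shard I/O latency
    return shard_id

shards = list(range(32))

# Serial: each read waits for the previous one to finish.
start = time.perf_counter()
serial = [read_shard(s) for s in shards]
serial_time = time.perf_counter() - start

# Parallel: overlap the I/O waits across 8 concurrent work paths.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(read_shard, shards))
parallel_time = time.perf_counter() - start

assert serial == parallel           # same data arrives either way
assert parallel_time < serial_time  # overlapped I/O finishes sooner
print(f"serial {serial_time:.3f}s vs parallel {parallel_time:.3f}s")
```

The speed-up Ken cites (two weeks down to four hours) comes from this kind of overlap applied at much larger scale, where per-request latency is hidden behind many concurrent streams.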
Starting point is 00:09:47 Let me give you a production real world example. So we'll eventually get to this as well. But UCLA is one that I know pretty well. My daughter goes to UCLA. It's one of the largest universities out there. I think it's the highest rated and the most applied-to public university in the United States. Glorious campus, by the way, if you have a chance to see it. One of the use cases I've seen is 200 scientists, where actually one person got a grant of, let's call it, 10 million dollars to work on earthworms. There are things going on besides COVID right now, by the way, but the work on earthworms is to find out, through their gestation period and their longevity, how the average human can last, instead of an average age of 78, to 82. They're trying to find ways to have longevity at the end of our life,
Starting point is 00:10:27 which is great. So imagine if you had those 200 scientists and you had a great GPU to run all through the work and you had a cryo-EM microscope, and that costs like a million dollars to run. So imagine if you had those 200 scientists and you upgrade the network, so no longer is bandwidth in your way. But now you're like, wait a minute, you're taking those 200 scientists and they each have to share their work in a serialized
Starting point is 00:10:48 fashion. So it'd be equivalent to me, you, Chris, and a few others all being scientists, but you only get to use this precious microscope between 10:45 and 11 on Tuesdays. I think you're muting the value of the scientists. You want all the value all the time. You know, I'm not a scientist, but I assume you're doing A/B tests, A/B variable tests all day long to try to find out how genome sequencing and the cryo-EM microscope can reveal other epiphanies that are out there. So on the training side, I'm familiar with autonomous vehicles, but on the production side, what we've seen is on a parallel file system, the beauty is that now you have 200 scientists who have all the use of the cryo-EM microscope all the time. It's a bad analogy, but I'll use it. It's almost like
Starting point is 00:11:31 if we were doing a project, Stephen, I know how hard you work on some of these flights that are out there. I've actually seen that firsthand. Imagine if you were doing a project and you had to get it done by the next day, would you use Microsoft Word for that? Heck no. Everyone's going to override everyone else's data. You have to use Google Docs. This is not a commercial for Google Docs, but my assertion in production, the missing piece is to move it from a training vehicle to a production vehicle. I'm sorry, a vehicle not meaning physical, but like in the case of life sciences, you need the parallel file system so everyone enjoys the benefits all the time and whether it
Starting point is 00:12:05 be COVID vaccine work, COVID resiliency work, or, uh, working with earthworms in the case of UCLA, the parallel file system gets into a production environment, which I think is what you're hinting at, Stephen. Yeah, that's really interesting. And I think, I mean, not to oversimplify, but you know, the deeper I dig into this and the more we talk about things with various guests, you know, it really seems that the primary use case for artificial intelligence, and especially machine learning as a subset of artificial intelligence, is really dealing with massive data sets, right? I mean, it's really about we have so much data available that there's no way people can pore through it. And so what we've got is machines that are analyzing that data. And that really seems to be the connection, right? This kind of big data requires AI
Starting point is 00:12:47 is a really kind of strong theme through this. And I wonder, you know, does that reflect what you understand? And, you know, or is that an oversimplification and missing the mark? No, let me give a quick tangible example. We'll probably talk about the vaccine later, but we deal with a certain customer that's out of England. It's called Genomics England. To my knowledge, you talk about big data, they're one of the biggest, I believe, out there. They're at 70 petabytes and they run a cold tier on premises with a company called Quantum around 68 petabytes. I don't work for Quantum, but just showing what they use. And then believe it or not,
Starting point is 00:13:30 they do another decoupled version out to AWS. So if you look at the amount of big data that's available, what they specifically do is Genomics England was founded with the premise of getting to 5 million genomes sequenced by the year 2023. They funded it. It was growing up. It was going from 28 petabytes up to 70. If you do the math, that's how many petabytes you need to actually run it. And then out came COVID.
Starting point is 00:13:56 Who, you know, the first responders, maybe some elderly demographic. That work is really important. And we talked about this before, the political environment in England was such that they would be respected by some of the global organizations. Actually, some of the vaccine work was stored in Switzerland, not done in Switzerland, but then actually put out in another area. But if you look at it, the actual
Starting point is 00:14:21 ability to decide and discern who gets a vaccine to maximize quality of life, Chris, that was 70 petabytes. The vaccine work I'm very familiar with was actually around 10 petabytes. So you might say the race to vaccine was really important. It was. But the resiliency testing was actually sevenfold versus the actual vaccine race. Hopefully that helps to show the scope and the scale of large files, parallel file systems in the large AI projects, what they're working on today. So on that note, Ken, not everybody is really as
Starting point is 00:14:51 up to speed as you and me on storage. I think that people understand, they comprehend that, you know, a petabyte or many petabytes of storage, that's a lot, but I don't think they comprehend the fundamental challenge of delivering that kind of capacity with, you know, integration, but also with scalability and performance. Because, you know, I mean, we live in a time when you can go to your local, you know, store and buy yourself like a, you know, eight terabyte hard drive for under a couple, you know, under $200. So somebody might say, okay, so well, so I need 1000 of those. What's the big deal? How come this costs something? How come this is a technical challenge? You know, why don't I just buy a bunch of hard drives? That's not how storage
Starting point is 00:15:35 works. And, you know, maybe we can take a little bit of a side here and just talk about this whole concept, as you mentioned, of scalable storage as a way, because I think if we can all understand that, you know, you've got video data, you've got life sciences data, you've got all the sensor data, you know, we're talking about petabytes of data that need to be made available in order to do machine learning, processing, you know, and even machine learning in production, you know, but how do we actually do that? What's the challenge to that? Yeah, the data mobility is going to be a hot trend more and more. Sorry, a little pun intended, but the money-making part, and again, sorry, we're talking more about
Starting point is 00:16:19 Genomics England recently. That's really not money-making, that's life-saving or quality-of-life improving, if that's the right way to put it. That three petabytes, I think people can get their heads around that number. Three petabytes, that's where everything's going through. But you might have recency of data. That's where you're making the information. But you need a place to put it, because no one throws away data. I mean, Chris, none of your guests, Steve, none of your guests are going to say, hey, I have a way to dispose of data. Lifecycle management is very important, but so is the ability to move using a global namespace. So you start at the beginning and you call it whatever you call it.
Starting point is 00:16:51 If you can move it and it stays in the same global namespace and the same parameters, the same naming convention, and it goes to what I would call a cold tier. Sorry not to get really technical, but that's really what it is. It goes to an area where you don't want to lose it, but you might recall it quickly. And if you know the video world, that's like, hey, last year's Oscars. I want to bring them up quickly. Don't put them in the back area. Don't make it manual. And then putting it out to the cloud as well. Anything that you give to a single global namespace that's a parallel file system, you get the economies of ease of management. Let me tell you briefly, and it's going to be a little bit touting our place, but it wasn't just the size that mattered for Genomics England. It was the ability to manage the storage. So using conventional storage, I'd call it non-modern, Chris, believe it or not, they needed 10 storage administrators to manage it. That's just too much wasted time and cycles. You couldn't be as nimble
Starting point is 00:17:49 as you wanted to. There was no auto-tuning or auto-tiering or data mobility. When they moved up to a modern parallel file system, one that really had no compromises, they went to 70 petabytes. You'll love hearing this. It took away the drudgery for nine of those folks. So literally, they redeployed the nine people that were managing the storage and they went onto the app development side or DevOps side. Both jobs were jobs,
Starting point is 00:18:13 but those nine other people actually were able to get into more money-making and more quality-of-life type environments. They're down to one storage administrator for the 70 petabytes. So it's around data mobility, and whether you're on premises, whether you're hot or cold, or in the cloud using an S3 interface, for example, that's the benefit. You want to be able to do more with less. That's a recurring theme, no matter what podcast
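The hot/cold tiering behind a single global namespace that Ken describes can be sketched in miniature. This is a toy model: the tier names, the 30-day threshold, and the recall-on-read behavior are hypothetical choices, and real systems (a parallel file system's tiering engine, or S3 lifecycle policies) do this transparently below the file interface.

```python
# Sketch: files keep the same logical path while a policy decides which
# tier holds the bytes; recalling a cold file promotes it back to hot.
import time

COLD_AFTER_SECONDS = 60 * 60 * 24 * 30  # demote after ~30 days untouched

class TieredNamespace:
    def __init__(self):
        # logical path -> (tier, data, last access time)
        self._files = {}

    def write(self, path: str, data: bytes) -> None:
        self._files[path] = ("hot", data, time.time())

    def read(self, path: str) -> bytes:
        tier, data, _ = self._files[path]
        # Any access promotes the file back to the hot tier.
        self._files[path] = ("hot", data, time.time())
        return data

    def tier(self, path: str) -> str:
        return self._files[path][0]

    def run_policy(self, now: float) -> None:
        # Demote hot files that haven't been touched recently.
        for path, (tier, data, atime) in list(self._files.items()):
            if tier == "hot" and now - atime > COLD_AFTER_SECONDS:
                self._files[path] = ("cold", data, atime)

ns = TieredNamespace()
ns.write("/genomes/sample-001.bam", b"...")
# A month later the policy demotes the file, but its path never changes.
ns.run_policy(now=time.time() + COLD_AFTER_SECONDS + 1)
assert ns.tier("/genomes/sample-001.bam") == "cold"
assert ns.read("/genomes/sample-001.bam") == b"..."   # same namespace
assert ns.tier("/genomes/sample-001.bam") == "hot"    # recalled on access
```

The point of the sketch is the one Ken makes: because the logical path never changes, applications and scientists see one namespace, while capacity quietly moves between expensive hot storage and cheap cold storage.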
Starting point is 00:18:34 you listen to, but that helps. And it also gives rise to a position that we hope we have time to talk about called a CDO, Chief Data Officer. That is, if you're just getting into this business, that's going to have a two-decade run, where people sit somewhere in between the data scientist and the rigor of IT. Well, we could probably talk about that a little bit more, but the chief data officers, that role is rising tremendously. Yeah, that definitely is interesting. And kind of the approach to, you know, data analysis and data scientists, you know, I wonder, I mean, obviously, you know, you kind of compared it to, you know, the CDO being the bridge between data scientists and IT. Obviously, IT is something that, you know, maybe, I guess now it's probably 20, 30 years ago,
Starting point is 00:19:16 companies didn't have in-house. And now, you know, you wouldn't think of a company of any size without an IT department to focus on these things. It sounds like you see that as something that's a coming requirement for every company. Is that an overstatement, or do you think the data scientists are going to be everywhere, or is this something that's a specialized function that some companies will focus on? Well, at the top of the show, I talked about, if you're not doing an AI project, you're losing to someone who is. That's a factual statement. And I'm sorry,
Starting point is 00:19:43 I'm a little bit of a history major as far as IT. So I remember in the mid 90s, coming out of a financial crisis in the early 90s, ERP was the big thing. And everyone ran towards SAP R3. And if you didn't do ERP then, and sorry, I lived in Cleveland during that time. And during that Rust Belt,
Starting point is 00:19:59 everyone went ERP and SAP just flew off the shelf because it tended to be able to put everything into one view of the data, which is great. And then you look at some other killer apps. I'll use Aurora High School. I remember for a period of time, people were doing app server, app server, app server. It was insane. Why do you have five storage admins at Aurora High School in Ohio when you're doing nothing more than payroll and administration tasks? You need VMware really bad. So VMware became a killer app that everyone rallied around. Sorry, one last history lesson.
Starting point is 00:20:27 Office 365 was like the first cloud app that really resonated. And I think, and we're using Zoom today, I think Zoom took off more because people felt comfortable using the cloud because Microsoft invested in Office 365 and it became an app where you didn't have to do all these upgrades.
Starting point is 00:20:40 I think AI is that new frontier. I think coming out of COVID, companies have been stockpiling cash to figure out where the end is in sight in COVID. And as we go to a new normal where it's more remote management and remote work, for example, you're in El Paso, Stephen's out of Hudson, I'm out of Santa Cruz, California, and we're all collaborating on what I think is a pretty good podcast. But my point is, you come out of there, your competitive edge is going to be AI. And I think more and more people are going to invest in those types of things. And that's the inflection point. And for those who are listening today, I actually was involved with a small nonprofit in Oklahoma that got it going, ready for this, with only five nodes and 64 terabytes. And they're running an AI project today. So it's not just for the big folks, it's for the little. And to answer your question precisely, yes, I think the chief data officer will be there, not just in the Fortune 2000, but everywhere, because someone has to be the connection between the two. And I've seen a huge rush
Starting point is 00:21:31 on hiring folks at all ages, not just the PhDs at the Ivies or MIT or Stanford, but really, really brilliant people that were data scientists from the abstract world being pilfered by, I'll call it the Russell 2000 and the Global 2000, saying, please lead our project, show us what's possible, retail-wise, AI-wise, and then we'll put the rigor of IT together. And I think, you know, a few years ago, I was a VP of sales. Now I'm a CRO. A few years ago, MIS directors became CIOs. You'll see that same inflection point. And I think the CDO is the rising role of the future. But I think it's a reality.
Starting point is 00:22:07 And I think, yeah, I would think nonprofits, certainly, and in the commercial world, it's a competitive weapon for them, Chris. Great question. And that was something that came up actually last week on the Utilizing AI podcast as well. The fact that companies are going to have to start recruiting people like this, and schools are going to have to start producing people like this. And I was mentioning that, in my experience, this is something I think that younger people are actually being trained in more than the, I don't know, old people like us. I mean, it seems like people coming out of college
Starting point is 00:22:40 are more aware of the challenges of data applications, and artificial intelligence and machine learning. And that's just sort of the new CS grad. I don't know if you're seeing that, Ken, just in terms of applicants to the company or who you're working with at these clients, is there an age factor or a generational factor in terms of IT folks? Yeah, yes, I'd like to tell you that there is no ageism when it comes to this. But yeah, we're seeing a lot of folks in their 20s to 40s that are coming up, I'll make it near and dear to Northeast Ohio, coming out of Case and CSU. Great schools where, you know, instead of them
Starting point is 00:23:26 going to work for a Tableau-type company, instead they're doing something bigger. No knock on Tableau, and I know they've been acquired recently, but instead they're doing AI projects in big data. My daughter's at Boston College. Three of her six roommates are big data majors. That was not a major a few years ago. All of them are coming to work for major web properties here in Silicon Valley. So they'll get used to the change of point. But yes, I think more and more, well, I would hearken it to, well, look at the skyline of San Francisco. Not to go too much off topic. Six years ago, San Francisco looked nothing like what it is today. Marc Benioff dreamed of investing in a certain persona.
Starting point is 00:24:05 It was called a CRM admin. It was somebody who sat by the bathroom or the elevator and managed an Oracle database that was called Salesforce. And that person was kind of a back office position and bluntly was making less than six figures. That person today is now making three times that amount of money. And it's the one person as the president
Starting point is 00:24:25 of the company I spend my most time with. Right before a board meeting, I'm not talking to my wife. I'm not talking to my three beautiful daughters. I'm talking to my Salesforce admin, getting the slides perfect for the board meeting and making sure the data is robust, almost to the point a few seconds before. Look what Marc Benioff did. He invested in the persona. He made them trailblazers. Remember the old Marc Benioff, I'm sorry, the Mark Zuckerberg-type hoodie that he made? He made them trailblazers. He invested in the persona, and the job now pays three times more than it did before. It's the same function: he's doing Oracle database through Salesforce as far as administration. I think the CDO is that function of the future. Okay, maybe Jensen at NVIDIA is the person who makes
Starting point is 00:25:04 that happen. Maybe they don't give out hoodies and maybe don't call it trailblazers, but the CDO is the profession because without them, you don't merge the pragmatism and the security of IT with what's possible, the academia of actual data scientists. And I think that's an incredible role. But yes, to answer your question, we are seeing it. And unfortunately, it's not people that are 53 years old. It's somewhat south of that, unfortunately.
Starting point is 00:25:40 one of the things I'm thinking about is the transition I've seen lately, you know, which kind of started with public cloud and has morphed in a lot of other ways where more and more of many companies' IT is actually outsourced, right? Because public cloud is really just outsourcing data center infrastructure in a way, right? And we're seeing a lot of, you know, managed service companies, and it seems like that pendulum is swinging that way, even to the point where, you know, we now see a lot of folks on the market with these kind of low code, no code type offerings where someone can actually, you know, it's not even just IT, it's actually the developers, right?
Starting point is 00:26:14 So I can actually, as a business leader, I can sit down and kind of express my intent and then create an application that does what I want. And so I wonder, you know, how much folks really need to invest in, you know, building up their own skill set around AI, or how much we'll see those kind of tools where, you know, can I as just a, you know, division leader or a GM tap into the resources of AI and data through kind of a low code, no code type interface, and be able to put this to play
Starting point is 00:26:44 without necessarily hiring a bunch of data scientists. I mean, is that, are we there yet? Is that something that's going to kind of leapfrog from IT or is there really a huge skill gap still? Chris, I have not found as a service, the AI talent pool. And then let me be clear, some of these companies are doing this,
Starting point is 00:27:03 especially universities, to gain market share. Because if you think about it, some of the Ivies and semi-Ivies, you know, I'll use Stanford, MIT and others out there, they're doing multi-petabyte AI projects to gain that foothold, to get that notoriety, so they can get people to come there. Because to be clear, in a COVID world, anything besides remote learning is vulnerable. So they're using that competitive edge as well. But to answer your question, in the commercial world, yeah, you're beginning to get, sorry, I'd call them younger people that are more astute with AI and APIs and no-code environments and the integration of your existing legacy applications with the AI environment. Because remember, AI is not your general ledger.
Starting point is 00:27:46 It's not your ERP. It's not even your data warehouse. It's a new initiative that allows it, but you have to have those feeds built in the backend. So I think for the existing IT folks, if they get very familiar with all the open APIs and have that as a skillset as part of it, that's a way to retrofit their career.
Starting point is 00:28:01 But I think eventually... I haven't found it, and I'm a customer of this. So if there is a service out there that lets me do AI as a service, and I can rent somebody for X thousands of dollars for two months, I'd be a buyer of that. I have not found that. I have to hire somebody who's a recent grad or a semi-recent grad, or who's maybe deployed it once or twice somewhere else. And then, brutally, I'll ask: what is the name of that company? So I can go off and find it for myself. But I have not found that to be a service. I've found that it's actually a hire. And that person is, well, we've been hiring CTOs forever that are kind of dreamers.
Starting point is 00:28:34 Put the dreamers out there, and then it's a way to retain talent and to get a bit of edge. But great question, Chris. Very astute. Well, Ken, before we wrap up here, any last points that you'd like to make about, you know, sort of these various practical applications? Again, as a reminder, you mentioned, you know, AI as a service, autonomous vehicles, manufacturing, life sciences, you know, as well as retail. You know, where do you see this going? And then we'll move on to our lightning round questions. Well, perfect. I can't wait for the lightning round, but not to look past it. Yeah, the autonomous vehicles are great. Yeah, I look forward to when we get back to commutes; we're not doing commutes just yet. At least I'm not.
Starting point is 00:29:18 When we get back to commutes, I look forward to multitasking during that commute and getting to full autonomous driving. We didn't talk that much about it, but we are a supplier to some of the larger credit card companies that work on making sure there aren't false positives. For example, I have allergies. When I go to buy a certain allergy medicine at CVS, and for some reason they're out of stock, and I go to three different CVSs in a five-mile drive, I don't want them shutting down my credit card. So that's one of those adjuncts of AI. We didn't talk about that, but that is one of the everyday use cases: making sure the user experience is good is part of it. And we didn't talk that much about the actual vaccine, but we've been involved with that at this company, which is great. You think about it, that's a race. And then we did talk a lot about resiliency, and I'm thankful that we were able to talk publicly about Genomics England. But where are we headed? I hope everyone's able to go in and out of a grocery store and buy their incredible produce and not have to wait in line. And I'm hoping that, on their commute, when we get back to commutes,
Starting point is 00:30:06 you can drive up 480 into downtown Cleveland or Independence, where I worked at Rockside Road for many years, and not have to put your hands on the wheel. So I do expect that. And I do think you heard it here. Well, Stephen, you probably broke the news, or Chris, you broke the news.
Starting point is 00:30:18 CDO, Chief Data Officer. Remember, you heard it here. If you're a 24-year-old coming out of CSU or Case or Ohio State or wherever you went to school, please consider that as a career. There's a vicious need for that, and we all can benefit from it as a civilization. Great. Well, Ken, so this is a new feature in season two of Utilizing AI. I'm just going to ask you a couple of questions and I just want your honest, quick answer. Give me 30 seconds or so on these questions. And again, listeners,
Starting point is 00:30:49 he hasn't been warned about these things. So we're going to get some fun, off-the-cuff answers here. So question number one: is machine learning a product or is it a feature? Machine learning is a feature. You mentioned autonomous vehicles. So here's the question. When will we see a full self-driving car that can drive anywhere, anytime in production? I'm privy to some stuff on that. I will tell you it will be 2022.
Starting point is 00:31:24 And then finally, a new question here that we've never answered before. When will we have basically machine learning applications in the home that use video the way that Siri and Alexa use audio? I'm going to answer that question. I'll go a bit longer on that. I can't answer that specifically. I'm not too close to that. What I am kind of close to is, if you look at like the Oculus, the virtual reality, I will tell you some of the web properties are jealous of the success of this whole Zoom generation. If you think about it, if you're a web property, wouldn't you want to change the game so we're not sitting in front of our Apple computer doing Zoom? Wouldn't you want to put on an Oculus and make it a better experience on your eyes? And, I remember John Chambers, what a great guy at Cisco, he talked about telepresence. What if you could change the game and, instead of, you know, I'm not talking about Google Glass, but with some type of wearable technology or Oculus-type glasses, you do full immersion and the meetings go with you, not your nearest desktop.
Starting point is 00:32:29 And that kind of changes the game. There are some web properties trying to do some type of virtual reality. So the meetings go to where you're walking versus where you sit down in front of a desk to do it out there. I think that's possible. I won't put a year out there, Stephen, but I'm not the right person to answer it on the Alexa and the voice recognition because I was involved with the creation of it, but I don't know exactly where that's going just yet. Sorry about not having a great answer on that one. Well, that's a pretty good answer. And that's the whole point of this part. It's supposed to be fun, off the cuff, catch people by surprise. So Ken, thank you so much for this. Where can people connect with you and learn more about what you're doing and maybe continue the conversation? Well, I'm on LinkedIn. I post quite often, especially on Friday,
Starting point is 00:33:05 something called Lunchtime Friday. I try to pay it forward, because I've been a blessed guy with a lot of good breaks and people who looked out for me. So I put that out on LinkedIn. It's grohe, G-R-O-H-E, at weka.io, W-E-K-A.io. And I'd love to hear from you, whether by email or on social media;
Starting point is 00:33:23 I'm on Twitter as well; it's @LeverageGTM, leverage go-to-market. And I would love to hear from you, whether it be on social or via email. Love to hear your feedback and how we can make the product and overall AI a better experience for the human race. How about you, Chris?
Starting point is 00:33:37 Anything you're working on these days? Quite a few things, but you can find all of it on my website, chrisconeman.com or find me on Twitter at Chris Coneman. And that'll branch out to all the different things I'm working on. And as I mentioned, I'm Stephen Foskett. You can find me on Twitter at sfoskett. You can also find the Utilizing AI podcast on Twitter. And one thing I'll point out that I'm pretty excited about is we recently did our first AI Field Day event. So if you go to techfieldday.com,
Starting point is 00:34:05 you can click on the AI Field Day logo. The AI logo is in the top and see some presentations by some great companies who are working on all sorts of different areas of AI. So that's a lot of fun for us. And thank you everyone for joining us for the Utilizing AI podcast. Again, we publish these every week on Tuesdays at 9 a.m. Eastern time. If you enjoyed this discussion, please do remember to subscribe, rate, and review the show on iTunes. And please do share it with your friends. This podcast is brought to you by gestaltit.com, your home for IT coverage across the enterprise. For show notes and more episodes, go to utilizing-ai.com or find us on Twitter at utilizing underscore AI. Thanks, and we'll see you next time.
