Utilizing Tech - Season 7: AI Data Infrastructure Presented by Solidigm - 14: Three Reasons AI is Getting Real with @WirelessBob

Episode Date: November 24, 2020

AI, machine learning, and neural networks are not new ideas. So what changed now? Over the last 5-6 years, advances in software, hardware, and scale have brought machine learning to the forefront, enabling new products and technologies. In this episode, Bob Friday, CTO of Mist, a Juniper Company, discusses the changes that have enabled his company and others to bring AI to the enterprise. We focus on four key questions: When did you start to see that AI was going to be a major force in the enterprise networking world? How have Google and other companies enabled AI through advances in open source software? How about hardware? There have been many advances over the last decade, from GP-GPUs to faster and lower-latency networking and storage. How has this enabled AI? How are large neural networks enabling AI in the enterprise? Episode Hosts and Guests: Bob Friday, CTO of Mist, a Juniper Company. Find Bob on Twitter at @WirelessBob. Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com and on Twitter at @SFoskett. Andy Thurai, technology influencer and thought leader. Find Andy’s content at theFieldCTO.com and on Twitter at @AndyThurai. Date: 11/24/2020 Tags: @SFoskett, @AndyThurai, @WirelessBob, @JuniperNetworks

Transcript
Starting point is 00:00:00 Welcome to Utilizing AI, the podcast about enterprise applications for machine learning, deep learning, and other artificial intelligence topics. Each episode brings experts in enterprise infrastructure together to discuss applications of AI in today's data center. Today, we're discussing the progress of AI from the lab to reality. And we're very, very thrilled to have somebody who's been at the forefront of that working with us here today. And that's Bob Friday. Yeah, thank you, Stephen. You know, AI for networking is one of those topics that's dear to my heart. My background is, I started a company called Airespace back in the early 2000s. Cisco acquired that, I was at Cisco for eight or so years, and then I went off and
Starting point is 00:00:45 started this. And that was really all about cloud and AI. So AI is a topic dear to my heart, and I'm happy to be here today. And I am Andy Thurai, co-host of this podcast, founder and principal of thefieldcto.com, home of the Unbiased Emerging Technology Advisory Services. I'm Stephen Foskett, organizer of Tech Field Day and publisher of Gestalt IT. You can find me on Twitter at @SFoskett and of course here on the podcast. So AI, machine learning, neural networks, these are not new ideas.
Starting point is 00:01:14 What changed now? Why over the last five or six years have we had such an explosion of things? And Bob, when did you see that AI was going to be a major force in the enterprise space? You know, Stephen, for me personally this started when I was at Cisco, when I was talking to some very large enterprise customers. And I would say there were really a couple of market transitions and technology changes that became the catalyst for this AI stuff taking
Starting point is 00:01:40 off. You know, on the market side, it was clear when I was at Cisco. I heard some very big customers telling me, hey Bob, what we really need is more reliable software. We've got to have things crashing less often. We need things to innovate faster. They really wanted to make sure that the network could keep up with their mobile digital projects. And the third thing was really around AI: they really wanted to make sure that they could get end-to-end visibility. There was really a paradigm shift from managing a network element to managing some end-to-end experience. You know, and that was really where AI started to come into its own. And when you look on the technology side, for me personally, it was like when I saw Watson play Jeopardy. You
Starting point is 00:02:25 know, that was a key point for me. It's like I said, if they can build something that can play Jeopardy at championship levels, we should be able to build something that can really answer questions and manage networks on par with network domain experts. And so those were the kind of transitions I saw in the market. I think from a technology point of view, for my masters back in my college days, 20, 30 years ago, I actually did neural networks. That's been around for a while. It was really around 2014, which seems to be a key year when you talk to software engineers and data scientists, that this whole data science AI stuff became real. I think it became real for a couple of key reasons.
Starting point is 00:03:11 One is the open source community. We saw Facebook, LinkedIn, Twitter, all these big companies basically contributing open source code so that startups could start building on the shoulders of this work. And the other key technology change was really around the cloud, right? Compute and storage. 20 years ago, when I actually tried to do some of this AI ML stuff, recalling my masters work, we really just didn't have the compute and storage, right? There's only so much you could do on a one-U Linux box, right? There's just not enough horsepower to do that. So right now today, if you want to do AI compute, you really don't have any worry about compute and storage.
Starting point is 00:03:42 The only thing you have to really worry about is your Amazon deal, right? There is really no limit on how much money you can spend at Amazon solving the problem. Yeah, not only spend on Amazon, but also probably on NVIDIA and other chips as well, right? I mean, it's funny you talk about that IBM Watson thing. I remember the good old days. I was with IBM in those days. The machine, well, I mean, I don't want to minimize what they did,
Starting point is 00:04:08 which was a pretty phenomenal thing in those days. But it is more of, you know, NLP with decision-making algorithms, a combination thereof. There was no need for, you know, multilayered neural networks and whatnot. If you had to do that with a box that you need to roll in like a mainframe, that would be almost impossible. I think the other thing we saw in 2014 was really the data. Those neural networks, 20 years ago when I built that neural network,
Starting point is 00:04:36 you really just couldn't build anything big enough with enough data that could do something useful. It's when we finally got enough data and these neural networks got big enough that we actually started doing interesting things with them. I mean, AI is nothing new, right? It's been around since, what, the 60s and 70s. The whole problem was it was all theoretical, because the computers and the compute power
Starting point is 00:04:58 couldn't catch up with it until now. And in the last five, six years, with the advances in compute chips and software and the open source, a combination thereof, it has become really, really good. Yeah. And if you look at what's really changed, where we saw it first happen was in, you know, the self-driving cars, right? That's when we saw these convolutional neural networks really start to change what we could do with image recognition. You know, we started to see that really make a big difference inside of the healthcare space, right? With AI and computer imaging, we can actually build something
Starting point is 00:05:36 that does a better job than the doctors in terms of diagnosing these MRIs and x-rays. Right. And I think what I found in the enterprise space, the key thing there, was really around these LSTM recurrent neural networks, you know, that are being used for speech, right? All the things you see with Alexa. In our networking space, those are really starting to allow us to do some of these anomaly detection things with incredibly low false positives, right? Yeah, now we can actually detect anomalies without generating noise, right? That's been a challenge in networking: we've always built systems that generate a lot
Starting point is 00:06:13 of events, but most of those events are noise. You know, waking the IT guys up at 2 o'clock in the morning to let them know they have some AP down is not very useful. And that's, from my perspective, really the key to what you've been doing at Mist: exactly that, finding value in the noise and sort of assisting people with determining what's important and what's not. I mean, is that really how you see AI being used? Yeah, I mean, if you think about the average person, whether it's cars or images,
Starting point is 00:06:53 really there is so much data and noise out there. What we're really doing with AI is trying to build these assistants that plow through tons of data, doing what people do manually already, right? You know, when someone calls up with a problem about their internet or their Zoom connection or something, there's really an IT person there that has to manually go through tons of data to find what's causing the problem, and it takes them hours, depending on how complicated it
Starting point is 00:07:17 is. You know, and this is where AI is coming in. A concrete example is something they call mutual information, right? Say you're in a distribution center and you're having problems with your barcode readers, and you've never been able to figure it out. These are techniques that allow you to take some user device problem and correlate it with 50 different network features, right? And figure out that really your problem is somehow related to the operating system, right? It's some version of the operating system causing the problem. That's the power of these AI machine learning techniques that we're seeing. So let's turn back to the points that you
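The mutual-information technique Bob describes is easy to illustrate. This is a minimal sketch with made-up data: the feature names and numbers are hypothetical, and the computation is the textbook discrete mutual information, not Mist's implementation:

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information (in bits) between two discrete feature arrays."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability
            px = np.mean(x == xv)                  # marginals
            py = np.mean(y == yv)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

# Hypothetical troubleshooting data: a barcode-reader failure indicator
# alongside two candidate network features.
rng = np.random.default_rng(0)
os_version = rng.integers(0, 2, size=1000)  # feature that drives the failures
ap_channel = rng.integers(0, 2, size=1000)  # unrelated feature
failure = os_version.copy()                 # failures track OS version exactly

print(mutual_information(failure, os_version))  # close to 1 bit
print(mutual_information(failure, ap_channel))  # close to 0
```

With real data you would compute this score for the failure indicator against each of the 50-odd network features and rank them; the feature carrying the most information about the failure (here, the OS version) is the likely culprit.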
Starting point is 00:07:56 made earlier. Basically, one of the three things that you said really enabled AI in the enterprise space was the advancement of open source software and the things that these cloud service providers are doing. How do you feel that that's really accelerated this? I mean, you mentioned that you can buy as much as you need now. Yeah, if you think about it, right? I will tell you, anybody here who wants to become a data scientist can do it over
Starting point is 00:08:23 the weekend now. I mean, if you actually just go onto the internet right now and get a tutorial on how to build a model that can recognize numbers, right, you can have that up and running in probably two or three days, right? And that is because all the data is there. They have data sets out there now that you can use to train your model. You know, you can basically go get your own Jupyter Notebook, download the Python code, and you'll find that the number of lines you need
Starting point is 00:08:50 to train that model is about four or five lines of code once you get TensorFlow up and running. So that is the power of open source. Most of these startups that are doing AI are building on the shoulders of Facebook, Google, LinkedIn, you know, companies that are really contributing the underlying technology we need to actually make this easy to use and consume right now. Yeah. So there are so many interesting points you made.
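Bob's "four or five lines of code" claim holds up. As a stand-in for the TensorFlow tutorial he's describing (this sketch uses scikit-learn's bundled 8x8 digits dataset and a linear model instead of a neural network, so it runs with no download), training a digit recognizer really is only a few lines:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the small 8x8 handwritten-digit dataset that ships with scikit-learn.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A handful of lines to fit and score a digit classifier.
model = LogisticRegression(max_iter=2000)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on held-out digits
```

Swapping in TensorFlow and the full MNIST dataset, as in the tutorials Bob mentions, changes the details but not the line count.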
Starting point is 00:09:16 So going back to finding the operational issues, that's huge. If you think about this, particularly with siloed operations and multi-cloud operations, right, one single event of a broken microservice in one cloud location can create, I've seen, on average hundreds or even thousands of events. And how do you correlate all of them? You know, the first one is what you said, noise reduction. Some of the AI platforms can reduce the noise by up to, like, 95%, right? And then correlate all these events.
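The correlation step Andy describes, collapsing a flood of related events into a handful of tickets, can be sketched in plain Python. This is an illustrative simplification: the event fields are hypothetical, and real AIOps platforms infer the common root cause rather than reading it off a field:

```python
# Hypothetical event stream: (timestamp_sec, service, root_cause_hint).
events = [
    (0, "checkout", "db-primary-down"),
    (2, "cart", "db-primary-down"),
    (3, "search", "index-lag"),
    (5, "payments", "db-primary-down"),
    (400, "checkout", "db-primary-down"),  # outside the first window
]

def correlate(events, window=300):
    """Collapse events sharing a root cause within a time window into
    one ticket each, instead of one ticket per raw event."""
    tickets = []
    open_tickets = {}  # root cause -> index into tickets
    for ts, service, cause in sorted(events):
        idx = open_tickets.get(cause)
        if idx is not None and ts - tickets[idx]["first_seen"] <= window:
            tickets[idx]["services"].add(service)
            tickets[idx]["count"] += 1
        else:
            tickets.append({"cause": cause, "first_seen": ts,
                            "services": {service}, "count": 1})
            open_tickets[cause] = len(tickets) - 1
    return tickets

for t in correlate(events):
    print(t["cause"], t["count"], sorted(t["services"]))
# 5 raw events collapse into 3 tickets
```

The hard part in practice is the grouping signal itself; with thousands of events per incident, even this naive windowing shows why one ticket per event is untenable.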
Starting point is 00:09:52 If you have a thousand events, you don't create a thousand tickets for them. Even if you create them, you somehow correlate them. Otherwise, your war rooms will last for months, right? So those two are really, really important things in using AI for operational efficiency, because most IT shops are spending more than half of their money on just operations. If you can make that much more efficient, that much more automated, with the combination of AI, insights, and automation, you could reduce that cost to a much smaller amount and use that money for innovation. Yeah, I mean, and this is why AIOps is now becoming
Starting point is 00:10:33 such a big key thing in the enterprise space. Because if you look at it, we're building on top of these cloud networks now; most of this AI stuff is going to the cloud. And that cloud foundation is what's really bringing the speed of innovation. It allows us to basically innovate every week, right? We're basically updating code every week right now and building on top of that AI. You know, if you look at the paradigm shift in the customer support model, what's interesting now is
Starting point is 00:11:02 once we have the data in the cloud, your vendor actually sees the same data that the customer sees, right? And that's the full benefit of getting that data in the cloud for the AI to work on. That by itself solves so many problems for customers, because now they have visibility into data they never had before. And then once we start to apply AI, we're basically turning the customer support paradigm upside down, right? You know, the days of a customer actually having to complain about their network element being broken are gone, right? We know when that network element's broken; the vendor himself can send a support ticket back to the customer. You know,
Starting point is 00:11:40 that's that proactive: yes, we know that your router is broken, please RMA it. That's where we're starting to see customers actually start to let AI make those support tickets. The operational piece is, once you gain the trust of your AI assistant, it actually becomes a member of your team. It becomes another member of your IT team. Right. Not only that, but also, you know, I've seen some of the AIOps platforms and IT tools create the tickets automatically, automagically as they call it, and group them together. And then they'll also assign them to the right support person. So you don't have to worry about, you know, sitting in the queue for hours or even days.
Starting point is 00:12:25 If that particular problem needs to be solved by this person or a group of folks who have expertise in that, what's the point of, you know, jumping through the hoops of L1, L2 and all that? Just send it to the SME. If it's an easy fix, let them fix it and move on. You know, that's where I'm starting to see customers,
Starting point is 00:12:40 you know, as they start to understand and adopt these AI assistants, starting to make these AI assistants kind of part of their IT team, right? Because it's really no different from bringing on a new employee. You know, most IT administrators aren't going to trust anybody with their network unless they get to know you, right? They really don't care whether it's an AI assistant or somebody else; somehow they have to gain that trust. You know, what I've seen with the AI stuff, they're starting to trust it to find bad cables, right? You know, start with simple tasks where, okay, I will trust you: if you see a bad cable, don't bother to ask me.
Starting point is 00:13:18 Just issue the support ticket. You know, so that's the beginning of where AI is starting to make an operational difference inside of an IT department. The other point you touched on, the CNN, which was pretty interesting. We did a project for NASA when I was at IBM. Your point about the data being there, this is a classic use case. You know, the NASA satellite, I wrote a piece on that. If you go to my LinkedIn profile, you'd be able to see it.
Starting point is 00:13:51 The sun, the UV rays, when they come in, you've got to measure that from a satellite, and then you make some decisions based on that, based on atmospheric conditions. That instrument broke, and they had the option of either sending another satellite or figuring out another way to fix it. The other way to fix it was to use AI and extrapolate from the data that was available for the last five years: based on the sun, how bright it is and what time of day it is, figure out how much UV there could have been. And surprisingly, AI was so accurate that we were able to get up to 98 to 99 percent accuracy on what the UV would have been. The data is already there, just not available or visible to the human eye.
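Andy's virtual-sensor story, estimating a broken instrument's reading from correlated historical data, can be illustrated with a toy regression. Everything here is synthetic and the model is deliberately simple; it shows the idea, not the NASA project:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic history: brightness and time-of-day readings alongside the UV
# measurement from the (soon-to-fail) instrument. Five years, daily samples.
n = 5 * 365
brightness = rng.uniform(0.2, 1.0, n)
hour_angle = rng.uniform(-1.0, 1.0, n)
uv_true = 3.0 * brightness - 1.5 * hour_angle**2 + 0.5
uv_measured = uv_true + rng.normal(0, 0.05, n)  # sensor noise

# Fit a linear model on the history (design matrix with an intercept).
# The real work used far richer models; this is just the mechanism.
X = np.column_stack([brightness, hour_angle**2, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, uv_measured, rcond=None)

# "Virtual sensor": estimate UV for a new day after the instrument failed.
x_new = np.array([0.8, 0.25, 1.0])
print(x_new @ coef)  # close to 3.0*0.8 - 1.5*0.25 + 0.5 = 2.525
```

Because the failed reading is strongly determined by inputs that are still being measured, the fitted model reconstructs it with high accuracy, the same effect behind the 98-99 percent figure Andy cites.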
Starting point is 00:14:29 Yeah, and that's back to the noise comment that Stephen made, right? That's the power of these convolutional networks and these LSTM recurrent neural networks: they're able to sort through so much data and filter out the noise. When I look at anomaly detection, that's really kind of an interesting case
Starting point is 00:14:47 where these LSTM networks are actually making multi-variable predictions. They're not predicting just the average, say the average of a stock price; they're actually predicting the variance. So these neural networks are getting to a point where you may have 10, 20, 30 features going in, predicting 10 or 20 features coming out. You know, and that's why they're starting to be able to do things on par with humans. And that's probably another
Starting point is 00:15:15 point I usually let people know: there's a subtle difference between AI and ML. ML is basically an algorithm you use to solve a problem. AI is something where you're trying to build something on par with human behavior, right? Whether it's driving a car, interpreting an x-ray or MRI, or, in the networking space, really trying to build something that can manage a network on par with a human. That turns out to be a very complicated task, right? The human brain is very good at seeing anomalies in data. That's why these neural networks are getting very close to being able to do the same things the human brain has been able to do in the past, whether it's
Starting point is 00:15:51 recognizing images or looking for anomalies in data patterns. So this is going to be driving, I think, kind of a paradox, though, because essentially we've already got too much data. Let's talk about enterprise networking: we've already got too much data, too much data for any person to look at. ML allows us to build systems that can look at that data. But the paradox is that the, you know, seductive appeal of having ML looking at this data and finding outcomes and finding value in it is that it's going to demand more data, right? I mean, are we going to start
Starting point is 00:16:29 seeing a situation where we're not actually reducing the number of metrics, or maybe even the number of flags raised to operators, but in fact increasing them, because we're just looking at more things? Yeah, you know, the problem of data is a real problem in the enterprise. And it's actually a problem that I've been working on with the ONUG group right now: how are enterprises going to deal with distributed data sources across all their different vendors right now?
Starting point is 00:16:58 And so I think we're really moving past the paradigm of trying to build gigantic data lakes. I really think what the industry is going to be moving towards is more of a virtualization. You know, you have your data across all these different places. Leave the data where it's at. Really start with the AIOps question you're trying to answer, and then only gather up the data you need to answer that question. So I think that's a problem we're going to have to see the industry address: how do you deal with the terabytes of data that are piling up across all these different places? You know, you can't really afford to build gigantic data lakes. There won't be a data ocean big enough to hold it all eventually.
Starting point is 00:17:39 So I have a related question on the data. Okay, so you have structured and unstructured big data sets, the data lakes and all of those things you talked about, but then there is also a related problem of having too much streaming data as well: between all the IoT devices, the events, the streams, the Kafkas, there is so much small data that also bombards the big data. So it's a combination of big data and too much small data. How do you keep your models up to date? Because every time you go to infer from the
Starting point is 00:18:18 information, the model has almost become obsolete, on a minute basis or even a seconds basis. So how do you cope with that? How do you keep your models fresh? And I think there are two topics there. One is how do you train models; the other topic is real-time data. You know, the definition of real-time is in the eye of the beholder. If I'm driving my car, I need that image recognition to be working in milliseconds. That's a real-time problem: I need to identify objects in front of me faster than the eye can blink.
Starting point is 00:18:53 I would say in most network enterprise problems, it's more about real-time streaming. So you have to have data pipelines that can handle data coming in as a stream. The models themselves typically don't change that fast. You know, like these anomaly detection models, we may basically retrain the model once a week over the last three months of data to look for seasonal changes, right, if we're looking for some sort of baseline. So when we think about real-time data, the good news is we have Storm, Flink, Spark; all these pipelines are now built to handle data streams in real time. The models themselves typically don't change. You don't train the model every minute,
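The pattern Bob describes, a model retrained periodically over a trailing window that predicts both an expected value and its spread, can be caricatured without a neural network at all. This sketch uses synthetic data and a per-hour seasonal baseline as a stand-in for the LSTM:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical network metric sampled hourly over ~3 months, with a daily
# seasonal pattern plus noise, and one injected anomaly in the last day.
hours = 24 * 90
t = np.arange(hours)
metric = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1.0, hours)
metric[-5] += 15  # the anomaly we want to flag

def seasonal_baseline(series, period=24):
    """Per-hour-of-day mean and spread learned from history. A crude
    stand-in for a model predicting both expected value and variance."""
    buckets = series[: len(series) // period * period].reshape(-1, period)
    return buckets.mean(axis=0), buckets.std(axis=0)

# "Retrain" on the trailing window (everything but the last day),
# then score the most recent day against the learned baseline.
mean, std = seasonal_baseline(metric[:-24])
recent = metric[-24:]
z = np.abs(recent - mean) / std  # deviation in units of predicted spread
flags = np.where(z > 5)[0]
print(flags)  # flags the injected anomaly at index 19 of the last day
```

Because the baseline models the seasonal swing and its variance, the ordinary daily peaks are not flagged; only the genuinely unusual point exceeds the threshold, which is the low-false-positive behavior discussed earlier.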
Starting point is 00:19:37 typically. I get that. So there's a somewhat related question. You know, almost every startup now, if you look at it, is either a cloud startup or an AI startup. So a lot of the startups are trying to solve specific enterprise problems. And then before you know it, Google or Apple or Facebook or someone else comes along and provides a better version of that and makes it available cheaper, or even open-source and free, and then enterprises are going to pile
Starting point is 00:20:09 on that. So for all these small companies that are starting up to solve some of the enterprise problems, what is the future? What is in it for them? Yeah, if you look at Uber, right? Uber built on the shoulders of map technology and phone technology and built a great company out of that. I mean, those are the Lego pieces for startups to build with and solve real problems, right? You know, we're looking at Google that's building the APIs that allow us to do image detection, right? I can hand Google any image in the world, and it will try to tell me what's in that image and what that image is all about. You know, if you haven't done it, it's kind of an interesting thing. If you go to Google, they have an image API. You can take your picture, stick it into Google, and they will come back with everything they know
Starting point is 00:21:07 about you from that. It's kind of a video search, right? That's probably the next thing we're going to see happening: we're going to move from text search to video search. You hand it a video and you'll get an answer back with everything it knows about that video. So I think when we look at startups, this is probably the best time for startups who want to do some sort of computer vision or text AI. You know, and it kind of builds right into that open source model, right? There's just another layer of technology available that no startup itself could really build, right? I mean, startups can't go off and recreate all the deep neural networks and technology needed for computer vision. You know, so we're really building on the likes of Facebook,
Starting point is 00:21:45 Google, and LinkedIn, you know, helping the next generation of startups get off the ground. So one more thing that I'd like to hit from your initial conversation there, Bob. Again, when I asked you kind of why now, one of the other things you really talked about was hardware, specifically chips and lower-latency networks and faster storage and all that kind of stuff. How much of a driver is that for AI at this point? Or is that just more an enabler of more advanced applications?
Starting point is 00:22:12 I think it's an enabler. I mean, I think Andy mentioned NVIDIA, right? The amazing thing now is, if you want to go to AWS, you can actually rent GPU servers, right? And they've optimized those GPU servers now to do either text recognition or some sort of image recognition a thousand times faster, a thousand times more efficiently. You know, so if I was doing a computer vision startup right now, I could probably process a thousand times more images than I would have been able to a couple of years ago, right?
Starting point is 00:22:45 And that's what we're starting to see out in the industry right now. You're actually seeing a whole other generation of silicon vendors really looking at trying to optimize the hardware needed to implement these convolutional neural networks, right? And you're seeing it really,
Starting point is 00:23:02 especially in the automobile industry, right? There are several startups that are trying to solve that problem of how do you do image recognition every millisecond, right? And that's just a ton of processing that needs to get done. You know, when you're processing video at 30 images a second, every 30th of a second you need to come back with an answer: is there a stop sign in the image somewhere? So that's the power of what we're seeing: hardware really getting optimized for these neural network algorithms. And I think even outside of
Starting point is 00:23:37 NVIDIA, right, I've seen several startups that basically claim there are even better architectures than GPUs for doing these convolutional neural networks. Yeah, no, that's great. So, you know, at the end of the day, my personal view is, so far a lot of those enterprises as well as startups have been trying to build things based more on a technology solution
Starting point is 00:24:04 that they're trying to build, whether it's neural-network-based, algorithm-based, what have you. But now I'm starting to see a lot of startups moving in to solve business problems, which is key, because none of those big companies, in my mind, would provide that. For example, one of the things we built a little early on is safety, industrial safety: looking at the people who are walking in and out, figuring out who's not wearing a helmet, and then sending alarms on that, so that you can tie up with the insurance company, saying that
Starting point is 00:24:33 they're 100% enforcing it, and hence my insurance rate should be lower. The idea is, if you're able to look at business problems and try to solve them using all the technology tools available, because if you're going to be making improvements in the technology and tools area, or the platform area, you'll always be in an arms race,
Starting point is 00:24:49 catching up, trying to catch up with others. But if you solve a business problem, then when better tools and technology become available, you can feed them in and make a better solution out of that. That's my personal view. That's where startups should be concentrating, in my view. Yeah, I would agree with you, Andy. I mean, if you look where this stuff is headed right now, you're just seeing amazing things as we see AI sneak into our lives.
Starting point is 00:25:14 You know, in the construction industry, I'm starting to see that they're starting to use drones and AI, basically, to look for those fine cracks. You know, all that stuff is sneaking into our lives. We're starting to see robots go up and down our grocery shelves looking for product placements. So, I mean, I would ultimately say, you know, AI is going to be kind of one of those things on par with TV, the internet. It's going to be one of these technology transitions that's going to touch all segments of the industry. It's going to touch all segments of our lives. And it's probably one of these 20,
Starting point is 00:25:41 30 year things that we're just going to see increase over our lifetimes and our kids' lifetimes going forward. Yeah, I find it hard to disagree with that. It's come up again and again in this podcast and the discussions that we've been having: all the different areas that are going to be touched by AI, both inside and outside the enterprise computing space. So thank you very much. You know, Bob, I know that you've done a lot. You've got a lot of stuff going on,
Starting point is 00:26:06 a lot of irons in the fire. Where can people connect with you and follow your thoughts on enterprise AI and other topics? Yeah, as I said, this is one of those topics dear to my heart. So anyone who's interested, feel free to reach out to me on LinkedIn, Bob Friday; you can find me there.
Starting point is 00:26:21 bob@mist.com also works. So feel free to drop me an email if you want to continue the discussion. This is one of those things I'm always happy to chat about. Over a glass of wine. Yeah, sometime, sometime maybe soon. And Andy, how about you? Love a glass of wine. You can find me on LinkedIn and on Twitter at @AndyThurai, or on thefieldcto.com, of course, or the Utilizing AI podcast page. Thanks, Andy. And again, I'm Stephen Foskett.
Starting point is 00:26:52 You can find me at sfoskett on Twitter. And I would love to hear what you think of the progress of this little podcast of ours. Thank you for listening to the Utilizing AI podcast. If you enjoyed this discussion, please remember to subscribe, rate, and review the show on iTunes since that really does help. Honestly, it really does. And please do share the show with your friends.
Starting point is 00:27:12 The podcast was brought to you by gestaltit.com, your home for IT coverage across the enterprise. For show notes and more episodes, go to utilizing-ai.com or you can find us on Twitter at utilizing underscore AI. Thanks a lot for listening, and we'll see you next time.
