Utilizing Tech - Season 7: AI Data Infrastructure Presented by Solidigm - 05x21: The New Direction of Hardware Led by Edge Computing
Episode Date: September 25, 2023

Edge is leading the hardware industry into a new era of innovation. In this episode of Utilizing Tech, Stephen Foskett and co-hosts Allyson Klein and Alastair Cooke sit down to dissect this. In the wake of Intel's discontinuation of its NUC product line, a question on everyone's mind is: what's next? Intel, and many behemoths like it, has a proud legacy of knowing how to break a stalemate and keep the market churning, even if that means stepping up and pulling the plug on an old product. It oxygenates the marketplace, welcoming new solutions and keeping the wheel of innovation moving. In the context of the emerging paradigm of edge, this change will likely propel the market towards a new breed of powerful, low-cost, pocket-size hardware that delivers breakthrough energy efficiency and compute performance with little infrastructure.

Hosts: Stephen Foskett: https://www.twitter.com/SFoskett Alastair Cooke: https://www.twitter.com/DemitasseNZ Allyson Klein: https://www.twitter.com/TechAllyson

Follow Gestalt IT and Utilizing Tech Website: https://www.GestaltIT.com/ Utilizing Tech: https://www.UtilizingTech.com/ Twitter: https://www.twitter.com/GestaltIT Twitter: https://www.twitter.com/UtilizingTech LinkedIn: https://www.linkedin.com/company/Gestalt-IT

Tags: #UtilizingEdge, #Edge, #NUC, @UtilizingTech, @GestaltIT, @SFoskett, @TechAllyson, @DemitasseNZ
Transcript
Welcome to Utilizing Tech, the podcast about emerging technology from Gestalt IT.
This season of Utilizing Tech focuses on edge computing, which demands a new approach to compute, storage, networking, and more.
I'm your host, Stephen Foskett, organizer of Tech Field Day and publisher of Gestalt IT.
Joining me today, back as my co-hosts, are Alastair Cooke and Allyson Klein.
Welcome to the show.
Hey, Stephen. It's great to be back. Yeah, it's awesome to be here again with you, Stephen. And three hosts,
that's got to be a record, certainly for me being on the show. Be our guest. So yeah,
we're all hosts. Everybody's a host here. We're just like Westworld. So we wanted to kind of
revisit the discussions that we've been having this whole season. And that's why I thought that
it would be nice to have you all on here, because I think the two of you have been on the most
episodes of anyone this season of Utilizing Tech, focusing on Edge. So I wanted to kind of
give us all a chance to sort of revisit the different
themes that we've come up with and the things that we talked about. Now, for me, looking back on it,
when we started, I was really keen on understanding the sort of software deployment aspect of Edge,
like how are we going to deploy and manage software at the Edge?
But it very quickly took a turn.
And I think that we can, let's address the elephant in the room.
The big turn that happened early on in the season
was Intel getting rid of the NUC business.
Now, we've got some resolution to that.
Intel has not really open sourced the design, but basically is encouraging partners to continue the NUC business.
They're working with Asus to explicitly take it over, but other companies are as well, like Simply NUC.
And of course, a lot of companies are coming to market with new hardware. Is the question about hardware a non-starter? Is it just solved?
Having worked at Intel, I think I can say this.
I think that the NUC is a beautiful device, but I think when there's a will in the industry, there's a way.
And we will find alternative platforms, either from those that are inspired by the NUC directly
or from competitors coming in with even better ideas.
So I think that from a standpoint of those far-edge implementations
that are relying on NUC,
I think we're in good shape to find other platforms
that will serve this market quite well.
Yeah, I think we have seen a heck of a lot of different vendors
coming up with things that look like a NUC but aren't a NUC.
And so in some ways the move by Intel was to inspire the market to move beyond PCs and Mini-ITX and get to a low-cost solution that allows you to run a reasonable amount of compute without having a lot of infrastructure around it. And they've achieved that, so it really does help a lot. They brought the market on, and now there are multiple choices of possible solutions to this problem, including some that have the same form factor but different processors.
Yeah, and I think that that's really right, what you guys are saying, that maybe we got too in love with that little box. I mean, it's a lovable little box, you know, you got to love it. But truly, there may be better solutions. And so another topic that I'm seeing
on the hardware front is sort of the collision of the two approaches to deploying hardware at the edge. And I'm calling those the duck
versus the mini server. In other words, the disposable unit of compute, that's the duck,
do you see, where basically you throw a Jetson or a NUC-like device or even a Raspberry Pi or
something like that out there. And if something happens to it, I will put another one out there, you know, as opposed to the wait, wait, wait, let's have a server that has, you know, that's quiet, that is capable of dealing with dust or grease or whatever.
That is maybe fanless, that can dissipate heat well, that is ruggedized, that has some capabilities to it.
So it's going to cost more, but it's a sort of a better,
more reliable system long term. I wonder whether we're going to go in one direction or the other,
or whether those are both viable for different use cases. I think they're viable for different use cases; there are always different problems to solve. And different problems require different
solutions. So I think there will be use cases where something very disposable is going to be a good choice.
But there will be other places where persistence of data, precious data stays at the edge,
or there's a requirement for more than you can really deliver with a duck.
I like the duck just as a term. Very nice. I think that this all gets down to workload, right? And how much flexibility of workloads that customers want. If they're wanting to deploy more things over time, that fanless server starts looking really attractive. But if they know their workload, they know that it can be well served by a duck, then duck it is.
So moving beyond hardware somewhat, but moving into a different area of hardware, there's no topic that has been more relevant in 2023, and that is certainly the case here on Utilizing Edge, than AI. This is a topic that everyone is talking about everywhere
and is certainly relevant when it comes to the Edge.
I was at a ball game the other day
and you paid for your stuff by setting it in a box
and picking it up and walking away with it.
And it just sort of knew what you picked
and charged you the right amount
and it didn't work. But theoretically, you know, it charged me the right amount. You know, I think
that there are more and more applications like this that are going to be deployed, especially
at retail, but also in manufacturing, in all other areas at the edge that utilize AI. Hey,
that was the name of the first three seasons that utilize
AI in edge environments. So what's your take on the collision of AI and edge?
I was just at the AI hardware summit in Santa Clara, and it was really interesting to talk to
these silicon providers who had really bet big on AI at the edge and are re-vectoring
their portfolios to the data center. So I think that while we see pragmatic solutions being widely deployed by enterprises, and getting value from the edge is something that many enterprises are enjoying, I don't think we've seen the broad proliferation of AI at the edge yet. And I think the industry is responding by maybe pulling back
a little bit on technology specifically for that purpose. Yeah, I think there's some older
implementations of technology. And I think what Stephen saw was probably something embedded using OpenCV, which has been around for a long time, to recognize what he'd placed in the box to order. And there's a long leap from that towards real AI, where large amounts of data, a large corpus of data, is used to train a model and to make predictions based on the data you've got.
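To make the contrast concrete, a minimal sketch of that kind of long-standing, classical OpenCV approach, simple template matching against a small catalogue of known product images rather than a trained model, might look like the following in Python. The file paths, product names, and matching threshold are hypothetical placeholders, not details from the episode.

import cv2

# A small catalogue of reference images for known products (hypothetical paths).
CATALOGUE = {
    "soda": "templates/soda.png",
    "hot_dog": "templates/hot_dog.png",
    "pretzel": "templates/pretzel.png",
}

def identify_item(frame_path, threshold=0.8):
    """Return the best-matching catalogue item for a camera frame, or None."""
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if frame is None:
        return None
    best_name, best_score = None, threshold
    for name, template_path in CATALOGUE.items():
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        if template is None:
            continue
        # Normalized cross-correlation: a score near 1.0 is a near-perfect match.
        scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(scores)
        if max_val > best_score:
            best_name, best_score = name, max_val
    return best_name

if __name__ == "__main__":
    print(identify_item("frames/checkout_box.png"))

Nothing here learns from data; it only compares pixels, which is exactly the gap being described between embedded computer vision and models trained on a large corpus.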
As Allyson was saying, that's not really all the way out at the edge, right? It's still the data center that's the place where you get a big payoff from big implementations. The edge is one of those places where you continue to have that little incremental value.
So I think we will see more dedicated hardware, like the Jetson and some of the Tensor core implementations; that's the direction we'll see. It'll be interesting to see those coupled up with some low-cost disposable compute hardware like the Raspberry Pi, like what we saw in the Jetson, where there was a small ARM core and quite a lot of offload
power. Those things, I think, are the future of where those disposable units of compute
are going to head. But then not every application wants AI as well. So I don't think we'll see
that exclusively. We're going to see more variation in what continues to be deployed
out in the hardware. And personally, as an old-time hardware person with a soldering iron sitting in front of me at the moment,
that's the stuff that I find really exciting and interesting.
Yet the real challenges continue to be
around the software distribution and management at scale of these things.
Yeah, with regard to, as you said, Tensor cores,
I think that that's a key consideration
because most of those disposable units,
especially the Pi and Jetson-type ones, are using smartphone chips.
And most smartphone chips are implementing Tensor cores now just sort of as part of the core.
So I think it's inevitable that the, I mean, I'm going to guess that the Raspberry Pi 5 will have Tensor cores built into it, right? I think that it's inevitable that the Jetson will continue to see traction,
that Sony is involved with the Raspberry Pi Foundation
and they're trying to put AI cores in there as well.
I heard that news story.
And then similarly, as we kind of move up that scale
to more modern servers,
I was just talking to Lenovo about their new line of servers
that can handle multiple full-size NVIDIA GPUs at the edge.
Now, that is pretty impressive stuff if you need a lot of power.
But we have to keep in mind, I think that people get confused.
They think about, oh, giant billion parameter ML models.
You're not doing that at the edge.
You're not training the model.
You're using the model. You're not training the model. You're using the model.
You're utilizing it.
So as our friends at ML Commons, who joined us on previous seasons, talked about with MLPerf,
if you look at the MLPerf inferencing tests,
you'll see that there's a whole range of hardware
that can support inferencing applications at the edge,
all the way from, yes, an H100, all the way down to a
mobile device, and even CPUs. That's the interesting thing. Intel showed that their Xeon CPUs are able
to run large language models effectively for some limited applications without a GPU at all.
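As a rough illustration of that CPU-only inference point, a minimal sketch using the Hugging Face Transformers pipeline pinned to the CPU might look like this. The small distilgpt2 model and the prompt are illustrative stand-ins, not what Intel demonstrated, and a real edge deployment would likely add quantization and batching.

from transformers import pipeline

# device=-1 pins the pipeline to the CPU; no GPU is involved at any point.
generator = pipeline("text-generation", model="distilgpt2", device=-1)

result = generator(
    "The store's edge gateway reported:",  # hypothetical prompt
    max_new_tokens=40,
    do_sample=False,
)
print(result[0]["generated_text"])

The same few lines run on anything from a Xeon server down to a small ARM board with enough memory, which is the spirit of the MLPerf-style range described here; only the latency changes.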
And I think that that's, from a hardware perspective, again, the answer comes back to, well, it depends on what you're doing. It depends on what your workload is. And there's an answer for it. If you need a ton of GPU, go buy yourself one of these Dell or Lenovo or Supermicro servers with an NVIDIA card in it. If you need less, I don't know, maybe you look at a Gaudi 2, maybe you look at something with a Tensor Core, or maybe you just run it on the CPU and wait a little bit. Does that seem reasonable, Allyson? I think so. I think that
we've known that inference runs well on CPU in a lot of applications for a while. When you were
describing that server, the thing that I thought about was, yes, they're going to power their
workload, but can they power the server at the edge? You know, that is going to
consume a tremendous amount of power to have that many GPUs in a box. And are we going to be looking
at power configurations and edge implementations like we do in data centers right now? That scares
me a little bit. Yeah. Well, then you have to worry about the environmentals because, you know,
if you don't want it to be a screaming fan and to suck down tons and tons of power, if you want it to be robust. And those are all things that these big server companies are really, really zooming in on, because they realize they need to address power, cooling, ruggedness, everything. I'm just imagining those multiple retail GPUs being shipped into the fertilizer factories that Ben Young and I have both worked at, sucking fertilizer dust through that then corrodes them out, and the lifespan becomes measured in weeks rather than years.
That's not a good edge answer.
So it definitely is different use cases for different places. And the more we see low-power, massively parallel systems, which is essentially what the Tensor cores and the GPUs are about, that's going to be heading nicely in the direction of getting to these more hostile locations. And there's a whole lot of hostile locations that are edge. They're not all friendly data center-like locations, or even friendly retail locations that have air conditioning for humans. It's a lot of places where the location is not fit for humans.
Yeah, absolutely. And we spent a lot of time talking about agriculture, ag tech at the edge, as you mentioned there. So another thing you brought up, Alastair, that came up this season and just now is software
and the Kubernetes question,
containerization versus virtualization.
These were key questions episode after episode.
Catch us up there.
What's your opinion on software at the edge?
Again, to take the consultant answer, it depends,
but that's always the beginning of the answer.
The answer continues on with what's the problem you're trying to solve and what's the right tool for it?
So if you're looking at a large scale environment, tools like Kubernetes are good for running populations of containers in a single location.
Now, there's a lot of work being done around managing multiple of those locations with Kubernetes but it's still sort of data center scale
numbers, 20s, 30s. So there's still work to be done there if you need Kubernetes
running out at hundreds of locations. I think I've always said the dirty secret
of the edge is that there's almost always a Windows machine sitting in a
corner that's been doing the same thing for the last 10 years. It has a huge amount of business process
knowledge written into the code that only runs on that Windows machine. And so
in the same way that there's this big uplift to move from on-premises into the
cloud, there can be this big uplift to move from that dedicated Windows-based
application to containers. And if there's no huge business value to it,
it simply isn't going to happen,
and so you're going to have to continue
to accommodate those Windows machines at the edge.
So for sort of more brownfield kind of edge
where you've had things previously,
you've been maybe running a retail organization
where you managed to upgrade from OS/2 Warp
to Windows 3 or Windows 95, hopefully,
maybe even 2000.
Those things, you've got a lot of sunk knowledge in them.
But if you're starting from scratch,
you've never had IT out here.
And this is what we're seeing
in a lot of the very extreme edge locations.
We're getting IT where there's never been IT before,
and you can build from the ground up.
You know, it seems to be the natural place
to build with containers
and probably a lightweight orchestration
using something like just Docker Compose and controlling Docker Compose out to all of these edge locations.
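For a sense of what that lightweight push can look like in practice, here is a minimal sketch, assuming SSH access and Docker already on each site, that copies one Compose file out to a list of edge hosts and asks Docker Compose to converge on it. The host names, SSH user, and paths are hypothetical, and a real rollout would add an inventory source, error handling, and retries.

import subprocess

EDGE_HOSTS = ["store-01.example.net", "store-02.example.net"]  # hypothetical sites
COMPOSE_FILE = "docker-compose.yml"
REMOTE_PATH = "/opt/edge-app/docker-compose.yml"

def deploy(host):
    """Copy the Compose file to one edge host and (re)start the stack there."""
    subprocess.run(["scp", COMPOSE_FILE, f"edge@{host}:{REMOTE_PATH}"], check=True)
    subprocess.run(
        ["ssh", f"edge@{host}", "docker", "compose", "-f", REMOTE_PATH, "up", "-d"],
        check=True,
    )

if __name__ == "__main__":
    for host in EDGE_HOSTS:
        deploy(host)
        print(f"deployed to {host}")

Compared with running Kubernetes at hundreds of sites, the moving parts here are just SSH and the Docker engine already on the box, which is the trade-off being described.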
I think we saw that quite a bit, that the edge does not have the same
characteristics as the cloud in terms of failover and redundancy. And that while we're using some
of the same software tools and some of the same approaches, we really need to be thinking about
a single point of failure at those edge sites and how does software need to change in order to accommodate that? I think the industry is doing a good job at looking at, you know, how
do we morph orchestration solutions in order to address that? But I don't think, you know,
in talking with customers, I don't think that customers have really thought through that nuance of the edge being a single point of failure versus the cloud being something that can fail over quite elegantly.
And so I think that's an area that I would like to keep tracking as we move forward.
So during the recording of this season, we attended VMware Explore.
VMware talked a lot about Edge. The Edge word was as much a buzzword,
well, not quite as much a buzzword as Gen AI, but it was right up there. And that came up earlier
in the season as well when we talked about the virtualization question and the need for
virtualization at Edge. Has your opinion changed on that throughout the season? I think that the virtualization discussion is something where I know that virtualization will always exist inside of enterprise.
It's kind of like any other core technology, that it will never die.
The question is, will new deployments utilize virtualization and for how
much longer? I think that the team at VMware did a great job at VMware Explore in describing NSX,
which I think is one of their key values moving forward in the market. But, you know, I think that
some of the cloud service providers' hypervisor offerings are really making me question VMware's core viability in terms of holding on to that North Star of virtualization and having differentiation with their solutions.
I think that that's something that we're going to have to look at as the Broadcom acquisition continues and we see the changing strategy for that company holistically.
Yeah, I think one of the things is that VMware has talked about Edge
for a long time, but you need to return to what was their core value proposition
as they started, and it wasn't around the virtualization.
It's the ability to take an unmodified application, run it on a new platform.
And that was the magic of virtualization, whereas the magic of cloud is modifying your application.
So that's a much harder deal.
So I think VMware has, or virtualization,
not necessarily VMware
because they're not the only virtualization platform around,
has a play at the edge.
One of the interesting things was that VMware has walked
back a little bit about their running virtualization on top of ARM and heading it towards
more, this is what we're going to do for DPUs, rather than we're going to do this with embedded
hardware that's going to run standalone out at the edge. So we're seeing less messaging from them around vSphere everywhere, ESXi everywhere, on every device, and that changes how you see them at the edge. We do understand that their VMware Edge solution, that sort of packaged solution for edge, is actually the Zededa product with their own layers on top. So for them to be relevant at the edge, they really do need to be managing, as Allyson said, heading out into that NSX, heading out into managing the network, and probably getting their SASE offerings glued together into something that's really good for that large number of distributed locations.
Because the network is king when you start pushing all of your compute out to the edge,
just as it is when you push all of your compute into the cloud.
The network becomes your most critical access, the most critical way of getting to your applications and getting value from them.
Yeah, I have to agree.
And I think that VMware sees networking, and rightly so, as a second line of business.
And it's not virtualization in the same way that vSphere is virtualization, but it really is.
I mean, everything they're doing,
as you mentioned from SASE to SD-WAN,
it's all virtualization of networking resources.
And I think that that's smart for them
and it's increasingly critical.
And I think VMware is being careful to
work with everybody in the industry to partner with them and not to compete with them with edge
solutions, because I think they realize that they've got a good hand to play in the edge.
And I think that it's going to work. Networking and security, what's your take, Allyson? I think that when it comes to far edge and near edge, one of the key differentiators is going to be those service providers that can deliver real value for network performance in that near edge.
I'm really intrigued. We had a conversation on one of the episodes, Stephen, about, you know,
what is that near edge, and what does it deliver in terms of a center for compute?
You don't get that without really performant network capability. And I think that that's an
area where I'm also looking at who are the providers that step forward and
really deliver differentiated services. I think that I've really got my eyes on the telcos having
a really good role there because they understand that distributed network connection and delivering
core capability without downtime. And I would like to see what they come up with. And I think we're
going to see that heading into next year with Mobile World Congress. I think that's going to
be a huge theme about how are they delivering those edge services and what kind of technologies
are underpinning that for them. I've always been interested in the edge perspective on AWS's Wavelength Zones, where you've got AWS services sitting inside points of presence, at the moment just Verizon's. And they address some of that latency issue that you get as you start moving to 5G networks, where the actual throughput and speed of data movement becomes so much that getting back to a cloud data center or to an on-premises data center becomes untenable, and you need that compute out much closer to the mobile devices, whether that's a human health mobile device or a mobile vehicle that has the application on it. So I think I agree with Allyson that the telcos, and particularly the mobile telcos, have a real role to play in getting the middle phase of
edge closer. And I don't think that we'll see this in complete isolation because there is a
relationship between the compute that's at the far edge and the compute that's at the near edge.
Unless, of course, you're talking about content delivery networks and that's a whole other story
that's edge but not edge as we've been discussing it.
Yeah, but it is a topic that came up this season.
But I wonder if maybe, you know, who's to say what's out of scope?
But it just shows how incredibly large this topic is once you start looking at it, once you start examining it.
Now, one of the things that Allison brought up here as well is the timing here.
When are companies going to be deploying edge solutions?
So talk to us a little bit about that.
I think that one of the things that I think about with edge is I started working on edge when I was in the industry and we were having heady conversations daily about what are these grand use cases that are going to be happening at the edge,
data gravity, AI at the edge, et cetera, how much of that compute is going to be taken away from data centers to the edge. What we're seeing is really practical applications. Alastair mentioned
CDN. I think that's been a huge driver of edge implementation, especially during the pandemic
when everybody was on their couches watching Netflix. But you also see retail point of sale. You see, you know, industrial
applications that are simplifying management of industrial sites. I think that we haven't got to
the entire gamut of use cases yet. But, you know, we're seeing, you know, really good
implementations with NUC-like devices at the far edge. We're seeing people thinking about near edge
and how you consolidate data closer to the point of creation. I think that's normal for a technology
transition. And I think that what you could say is the edge is happening faster and
slower than anyone imagined, depending on that use case. And I think that it's going to be really
interesting to see how those use cases broaden over the next couple of years, where I think AI
is going to be the big driver of how do we get that inference done closer to the data.
But, you know, Alastair, I'm really interested in your opinion.
Well, I think it's a case of a very familiar story that the future is here.
It's just not evenly distributed.
That there are people doing incredible things that were getting you excited as sort of beginnings of ideas in the past years
and are actually out in real use. But there's also the innovator's dilemma that if you put out a
solution a few years ago, it's in place, it's working, that there's a lot more cost to replacing
that with something that works incrementally better. And so there are some challenges around getting from your past to that future. And it really is a difficult thing to move from platform to platform.
As Stephen talked about a while ago, he did edge out at retail places where the edge was 15
different edges pushed into one little place because each edge was siloed. And the move from
that to having a unified platform that delivers the applications would
have been very painful.
Well, there's a change in the types of unified applications.
So trying to catch the wave of technology that's available, that delivers some value
within the business that justifies the investment and the change is what drives adoption.
And it's always behind what the visionaries see as the potential.
But it's really important that the visionaries see potential
because then the people who come and implement have some vision to start from.
So it is always thus, and innovation always works this way,
that it's taken up in places differently,
and its full potential
is never visible at the start. One of the topics kind of leading from that that I have been
noticing and I have been struggling to put into words is the question of sort of who's making
these decisions. And I think that what you guys are saying just now makes so much sense. And it
really kind of crystallizes it to me. It's all about the buyer. One of the reasons that Edge was
so fragmented was because it wasn't really a conventional IT system at all. It was basically
a collection of vendor solutions that were being deployed wherever they
needed to be deployed, because that's what the line of business or that particular aspect of
the business wanted. And so, for example, in my retail experience, the reason that we had
a dozen different systems, even with a dozen different backhauls sometimes, was because that
was a dozen different buyers, and each one was a different aspect of the company in that location.
And I think that this has become really crystal clear.
The more I talk about this, that it's a different world.
Another big, you know, we talk about what makes Edge, Edge, right?
There's multiple locations.
There aren't, you know, on-site IT resources.
You know, you need to be more focused on, you know,
repeatable zero touch provisioning, all these things like that. Well, another big difference
is the question of basically, like, who's the customer here? So in the data center,
it's kind of a top-down decision. You know, companies, a big company will go to the CIO
at the organization, or maybe, you know, the chief of a different, you know, application area or
something. And they'll say, like, you know, hey, here's this big thing. You want to buy this big
thing? And then the boss says, yeah, we're buying a big thing. And it's a top down decision.
In the cloud, we've noticed a very different approach. In the cloud, it's very technologically
focused. And it's very much what I call middle up. In other words, hey, I've got this amazing solution, Google developed it, it lets me manage containers, we're doing this. And the boss is like, Kuber-what? Okay, I guess
we're doing it. And so it's a middle up kind of decision. At the edge, it's sort of like,
you know, like over the wall. It's all driven by the lines of business and the people who are
doing things, the restaurant management group or the manufacturing group or whatever. I heard somebody say that sometimes these solutions,
especially at retail, are being selected and deployed by somebody whose primary job is walking
around with a taser and keeping people from shoplifting. And that's not a conventional IT buyer. It's a very different person that's selecting and deploying these solutions. And it's another reason that the edge is different,
even though it looks just the same. When you think about that, I think that you have to think
that customers at the end of the day are going to take this technology and deploy it where there is
real business value. Regardless of what the industry thinks about, whatever we come up with in our heads,
it may not necessarily be something that offers an opportunity for business to be propelled.
And sometimes those use cases are things that nobody ever thought about before.
Stephen, before we recorded, you said ChatGPT is a perfect example of a technology
that we never talked about before it came up.
I think that that's always going to be the case
where I love watching how adoption happens
and what creative people think about
to do with things like NUCs, with things like Kubernetes, with, you know,
with things like a server that's, you know, deployed in a manufacturing setting. You can,
you can see how the value of the workload and the value of the use case drives the deployment and
then, you know, in turn drives future industry innovation. It's an exciting thing.
And I think that change in buyer is also something about maturity. We saw the first
buyer for VMware's virtualization was some business unit who had to solve a little problem
and progressively it became this is a platform for the entire organization. And that's the kind
of thing that I think is going on with Edge is we're moving from these point solutions used by the guy
with the taser towards the central organization saying we need to commit to a platform here, and then that's going to enable the guy with the taser to choose just the little bit that's needed for his part of the role, and maybe the advertising manager will be looking for a different component
to sit on top of that same platform that satisfies whatever her requirements are.
So I think there's a maturity thing as that buyer of the edge platform changes, but the actual
edge applications that run on top of those platforms will probably still be very business
unit driven. And I think for this reason, if I can be a little bit self-serving,
I think that what the industry needs, what the edge industry needs is information. It needs people
talking about solutions, talking about products, weighing the pros and cons, not hype. I mean,
that's the problem with a lot of this stuff, especially anything that's going to be sold and
profitable and everything, and especially AI. Oh my gosh, there's so much hype. What the industry needs is people just kind of approaching
things and looking at them and deciding what does this really mean? I mean, heck, we talked about
blockchain on the program and didn't pitch, you know, Bitcoin and investments and NFTs and stuff.
We talked about, you know, is it practical? What does it make sense for? And it's the same with this stuff. We need information. We need to consider that. We need to share that with the
world. And I think that that's exactly what we're trying to do here on Utilizing Tech. I think that's
what we've tried to do all season long. That's what we're going to be doing at the Edge Field
Day event and at Gestalt IT all throughout the year. Thank you both for helping me to kind of
bring this season to
a close, to put a bookend on it. A phenomenal discussion that brought up kind of topics from
every episode this season and in a way that really made a lot of sense. So I really appreciate this
time. If people are thinking, man, I would love to continue this conversation with Allyson or Alastair, where can they find you? Where can they continue this conversation? Allyson, let's start with you. You can find me at thetecharena.net and at TechAllyson on what was formerly known as Twitter, or Allyson Klein on LinkedIn. You can download my Edge ebook, which I published just a couple of months ago at the beginning of this Edge series, and read about different companies that are doing really interesting implementations on the Edge.
And you can find me, Alastair Cooke, online if you search for Alastair Cooke and put in something technical, not something sporting, and you'll find me. But also, on the platform formerly known as Twitter, I'm DemitasseNZ, because demitasse.co.nz is my company website, and you can find my writings there, and periodically in other places around the world. Occasionally, Gestalt IT asks me to write some things. And you will, of course, see both Allyson and me at future Tech Field Day events over time. And that's a hugely valuable
resource. I'll save Stephen from telling you that Tech Field Day is a great place to find lots of
independent thought about a variety of different technology topics, particularly a lot of data
center and networking technologies, which we've just discussed are really important. So Stephen,
where can we find you in the next few months?
Well, thank you, Alistair.
Thank you for asking.
Yeah, so I'm Stephen Foskett.
I will be hosting Edge Field Day here.
You can find out more about that at techfieldday.com.
I'll also be hosting our weekly Gestalt IT News Rundown.
Just go to gestaltit.com for that.
We also have On-Premise IT, which is our Tuesday podcast.
And yes, we will be doing another season of Utilizing Tech.
We're still in the works on that.
So keep an eye on Gestalt IT for the announcement of Season 6, I think, of Utilizing Tech.
Can't wait to talk about that.
Thank you so much for this.
If you'd like to reach me personally, I'm S. Foskett on most channels,
including X, Twitter, and Mastodon, as well as on LinkedIn. So thank you so much for listening to
this season of Utilizing Edge, part of the Utilizing Tech podcast series. If you enjoyed
this discussion, please do subscribe. You'll find us in your favorite podcast application.
Please give us a review, give us a rating, let us know what you thought of the season. This podcast is brought to you by gestaltit.com, your home for IT coverage from across
the enterprise. For show notes and more episodes, though, head to the dedicated website, UtilizingTech.com, or find us on Twitter @UtilizingTech, or on Mastodon. Thanks for listening,
and we will see you next season.