Semiconductor Insiders - Podcast EP344: An Overview of the Upcoming Sensors Converge Event with David Drain

Episode Date: April 27, 2026

Daniel is joined by David Drain, show director for Questex's Sensors Converge and Broadband Nation Expo, where he leads strategy, content, and industry engagement for two of the company's flagship technology events. Prior to joining Questex, David spent more than 15 years with Networld Media Group, most recently as vice president of events and managing director of the Interactive Customer Experience Association.

Transcript
Starting point is 00:00:07 Hello, my name is Daniel Nenni, founder of SemiWiki, the open forum for semiconductor professionals. Welcome to the Semiconductor Insiders podcast series. My guest today is David Drain, show director for Questex's Sensors Converge and Broadband Nation Expo, where he leads strategy, content, and industry engagement for two of the company's flagship technology events. Prior to joining Questex, David spent more than 15 years with Networld Media Group, most recently as vice president of events and managing director of the Interactive Customer Experience Association. Welcome to the podcast, David. Thanks for having me, Daniel. So the show is called Sensors Converge. What does converge really mean in today's context,
Starting point is 00:00:47 especially with AI and edge computing and sensing all coming together? Yeah, that's exactly it. It is all about the coming together of those technologies that used to live in silos, whether that was sensors or connectivity or compute, and now that includes AI. So what's changed is that intelligence is no longer centralized. So you've got sensing at the edge, AI models running locally, and systems interpreting real data in real time. And when you look at the scale of the opportunity, ID TechEx forecasts the global sensor market will surpass
Starting point is 00:01:31 $250 billion by 2036. So that really underscores why this convergence is happening now. Yeah, AI is quite a disruptor. So who is the core audience for Sensors Converge? And how has that audience evolved over the last few years? Sure. So our core audience is engineers and technical decision makers. So these are people who are designing and building products.
Starting point is 00:01:58 That hasn't really changed. but I think what's changed is the mix. We're seeing more AI ML engineers, more system architects, more people focused on the full system integration rather than individual components. So at this year's events, we're expecting about 5,000 attendees, about 200 exhibitors. We've got about 100 speakers.
Starting point is 00:02:22 So it really reflects, you know, that broader ecosystem coming together. That's a big conference. So what are the biggest technology or industry themes that are shaping this year's event? And, you know, where are you seeing the most real world momentum versus, you know, the hype that we normally see? Right. Yeah. I mean, AI is obviously front and center, but the shift is really towards practical deployment. So not just models, but how can you actually run AI on constrained hardware? You've got to balance power, latency, cost. Multi-sensor fusion is another big theme where you can combine different data types to create more reliable, contextual systems. One of the things I've heard is we don't really have an AI problem right now.
Starting point is 00:03:19 We have a systems integration problem. The models are getting good enough, but stitching it together, right? The sensors, the compute, the connectivity. and that power into something that works reliably and at scale, that's where most teams are struggling and where a lot of the innovation is happening. Interesting. So can you highlight a few standout speakers or sessions that really capture where the industry is headed? Yeah. You know, we've got a strong lineup this year. We've got a keynote presentations from Rob Watts, who's the principal AI architect at Intel and Omar Abed, who's the chief technology officer of
Starting point is 00:03:57 Vincent's, which is a TDK group company, and lots of others. You know, what's exciting is they represent both sides of the equations. You've got AI architecture and sensing innovation and how those are coming together. So I'd say more broadly, the program is very use case driven. So whether it's smart infrastructure or medical sensing or industrial automation, you're seeing real-world applications, not just theory. And what makes sensors converge different from other semiconductor or electronic events? You know, why should someone choose this event to attend?
Starting point is 00:04:36 Sure. Yeah, you know, a lot of the events focus on just one layer of the stack, so whether that's semiconductors or AI or IoT, and what makes sensors converge unique is that it brings the full ecosystem together. You can go from a men's sensor company, to an edge AI platform to a connectivity solution, all in one place. And like I said, more importantly,
Starting point is 00:05:01 it's not just theoretical. We were putting a big emphasis on practical hands-on learning. For example, we have a couple of workshops where attendees are actually working with technologies like Laura WAN or understanding how sensors connect and how that data feed systems, like others, focused on adaptive self-configuring sensor systems. So again, speaks to that system's challenge.
Starting point is 00:05:26 We're not just talking about the pieces. We're helping people understand how to make them work together in real world deployments. And that's been a big focus for us this year. Interesting. So looking ahead, how do you see the role of sensors evolving in enabling the next wave of intelligent systems and more importantly, AI applications?
Starting point is 00:05:47 Right. So we believe that sensors are becoming the foundation of intelligent systems. They're the bridge between the physical and digital worlds. So what's changing is that sensors are becoming smarter, more integrated, and tightly coupled with AI. So going forward, the real innovation will come from how sensor data is used, how it's, well, how it's fused,
Starting point is 00:06:14 how it's interpreted and acted on in real time. And that's what enables things like autonomy, predictive systems, and more responsive systems, and will respond. responsive environments. Great. It's nice to speak with you, David. Looking forward to seeing at the conference.
Starting point is 00:06:29 Thank you, Daniel. Appreciate the time. That concludes our podcast. Thank you all for listening and have a great day.

There aren't comments yet for this episode. Click on any sentence in the transcript to leave a comment.