Semiconductor Insiders - Podcast EP271: A Tour of SiLC’s LiDAR Technology and Its Broad Impact with Mehdi Asghari

Episode Date: January 24, 2025

Dan is joined by SiLC Technologies CEO Mehdi Asghari. Mehdi is one of the early pioneers of the silicon photonics industry, with over 25 years of commercialization experience in this space. SiLC is his third silicon photonics start-up, focusing on advanced imaging solutions. His previous start-ups, Kotura and Bookham, both enabled successful exits.

Transcript
Starting point is 00:00:00 Hello, my name is Daniel Nenni, founder of SemiWiki, the open forum for semiconductor professionals. Welcome to the Semiconductor Insiders podcast series. My guest today is SiLC Technologies CEO Mehdi Asghari. Mehdi is one of the early pioneers of the silicon photonics industry, with over 25 years of commercialization experience in this space. SiLC is his third silicon photonics startup, focusing on advanced imaging solutions. His previous startups, Kotura and Bookham, both enabled successful exits.
Starting point is 00:00:39 Welcome to the podcast, Mehdi. Thank you so much, Daniel. It's a pleasure. First, may I ask how you got started in the semiconductor industry? Do you have a quick story you can share here? Yeah, sure. So I knew that I was interested in engineering, but as a young lad, it was hard for me to know exactly what I wanted to do in this space. So I decided to go for one of these engineering degrees that allows you to do a broad subject study. Back in Cambridge, there was a degree that just allowed you to do engineering, and in the second
Starting point is 00:01:12 year, you would choose which branch of engineering you wanted to do. So in the first year, we did everything from architecture to mechanical to electrical and so on. And I remember there was one of our profs, his name was Frank Payne, who was an amazing prof and he did fantastic lectures on semiconductors and optics. And he basically was the reason why I got so excited about how electrons and photons interacted and how optical devices and semiconductor devices worked. And I used to go to him at the end of the lecture and he would take me down to the library where only the profs could go and show me all these articles.
Starting point is 00:01:50 And that really was what piqued my interest in semiconductors. And the interaction with a very enthusiastic prof was really what grabbed me into semiconductors. Yeah, I had a similar experience. The excitement of the professors really made a big difference. So, AI is a very popular topic. Let me ask, what role is AI playing in machine vision? And how does your LiDAR technology leverage AI? That's a great question. I guess there are two ways to look at this, right? One is to say what AI does for machine vision,
Starting point is 00:02:26 and I'm actually even more interested in what machine vision can do for AI. So, to answer your question about what AI does for machine vision, the quick answer is that it allows us to teach our machine vision systems to do new things rather than program them. That's how humans learn, right? We kind of teach each other, we teach our kids, rather than program them. And that means that a universal product can do many things, but can be taught different processes.
Starting point is 00:03:02 So that's also what we can do. We make sure that our products have adequate spare capacity so that our customers can also teach them to do different things or new things. There is obviously some limit to it, because the hardware also defines what the product can do, in the same way that with humans, some people are better at doing certain things than others. So it's the same thing for machine vision. Now, in terms of what machine vision can do for AI, I would say that that's really exciting for me, because I think machine vision can enable AI to do many, many more things, especially in terms of applications that currently are not within the scope of AI. Most AI tools today are really in the virtual world.
Starting point is 00:03:54 And, you know, you can't really get AI to build you a house, or to do your housework for you, or take care of the garden, or drive your car, and so on. And that's what we really want AI to do: to take physical form and do actual work for us. And I think that's what machine vision can do for AI: enable it to take physical form and actually help us in our physical world every day, and contribute to our daily activities and the growth of our GDP in more ways than it is doing today. And how does your LiDAR technology leverage AI? Yeah, so we basically have a certain amount of programming inside the system, and we teach the machine vision to learn new things. So, for example, it will use AI to establish what different objects are. The machine vision has a certain number of cues, or pieces of information, such as the polarization of the light that comes out, the intensity of the light, velocity information, depth information, and color information. And then we use AI to establish the
Starting point is 00:05:02 shape of the objects more accurately, to distinguish one object from another, to know what these objects are and what they are doing. For example, in the case of a drone versus a bird: is this a drone or is it a bird? And what drone is it, and what is the intention of the drone? Is it something we should worry about or not? So AI allows us to do a lot more application-oriented processing and get a lot more useful information from the images we create. Okay. You know, you mentioned drones. So counter-UAS and perimeter security are on the rise. You know, drones are in the news.
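The multi-cue classification Mehdi describes — polarization, intensity, velocity signature, depth feeding an AI stage — can be sketched as a nearest-prototype classifier. The cue names, prototype values, and scaling below are illustrative assumptions, not SiLC's actual feature set or model:

```python
# Toy sketch of multi-cue object classification (drone vs. bird).
# Cue names and prototype values are illustrative assumptions only.
from dataclasses import dataclass
import math

@dataclass
class Cues:
    polarization: float     # degree of polarization of the return, 0..1
    intensity: float        # normalized return intensity, 0..1
    velocity_mod_hz: float  # dominant modulation in the velocity signature
    depth_m: float          # measured range (kept as context, not used below)

# Hypothetical prototypes: a propeller modulates the Doppler return at
# hundreds of Hz, while a bird's wing beat sits at only a few Hz.
PROTOTYPES = {
    "drone": Cues(polarization=0.8, intensity=0.6, velocity_mod_hz=200.0, depth_m=0.0),
    "bird":  Cues(polarization=0.3, intensity=0.4, velocity_mod_hz=5.0,   depth_m=0.0),
}

def classify(c: Cues) -> str:
    """Pick the class whose prototype is closest in the scaled cue space."""
    def dist(a: Cues, b: Cues) -> float:
        # Depth alone doesn't discriminate class, so it is left out here.
        return math.hypot(a.polarization - b.polarization,
                          a.intensity - b.intensity,
                          (a.velocity_mod_hz - b.velocity_mod_hz) / 100.0)
    return min(PROTOTYPES, key=lambda name: dist(c, PROTOTYPES[name]))
```

In practice this stage would be a learned model rather than fixed prototypes; the sketch only shows how heterogeneous cues can be combined into one decision.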
Starting point is 00:05:42 Tell us how your technology addresses this growing demand. Yeah, absolutely. So here we are very proud of our collaboration with a company called DZYNE, D-Z-Y-N-E. These guys are the leader in manufacturing drones, but also in tracking drones and mitigating them, using a whole host of technologies they have. We've created a partnership with them, and we are working together on full systems that are able to detect drones, track drones, classify drones, and then mitigate them. Here, our LiDAR solution provides far more information about the object than a camera or a radar can.
Starting point is 00:06:24 So the system typically works with multiple sensors, a LiDAR being one of a number of other sensors. These typically include a radar, which will be the first stage for detecting an object out there, maybe many kilometers away. There are usually cameras also involved, because humans still like to see things the way they're used to. And then the LiDAR that we add contributes a lot more information about the precise dimensions of the object, which radar cannot give you, because our solution has far more resolution. It also adds the velocity signatures of the object.
Starting point is 00:07:06 With our technology, we can actually see the velocity signature of a propeller on a drone, for example. And that's a big cue compared to a bird: the way the propeller on the drone rotates and the way birds swing their wings are very different, and you see that very clearly in a LiDAR velocity signature. And our systems are able to detect a drone, track a drone, and classify a drone at distances up to a couple of kilometers today, and we are working with DZYNE to extend that range. And that means that once the radar has detected that there is some object out there, and the radar may not be able to exactly establish what it is, whether it's a bird or a drone, the LiDAR then creates a high-magnification image of the object and starts to establish information such as velocity and polarization so that you can do classification. And you can say whether it's a drone or a bird, and you could, in fact, even be able
Starting point is 00:08:16 to establish what type of drone it is. And based on that, you can then take mitigation action: whether you want to bring it down or whether you want to allow it to proceed to its destination. That's amazing. That's just amazing. So your company recently launched Eyeonic Trace, your first product targeted at industrial and warehouse automation. Can you tell us a little bit more about that? Yeah, absolutely. We're really excited about Trace, and it's built on a number of major trends in the industry. So first of all, COVID showed us how vulnerable our entire logistics and delivery systems were.
Starting point is 00:08:53 And there was a major push after COVID to build a more robust logistics and delivery ecosystem. However, this was impaired by a lack of working-age population. You know, unfortunately, birth rates have been dropping, and a lot of people decided to retire after COVID, and the industry has really been struggling to find the required workforce, especially for activities that require fairly harsh physical work in environments which quite often are not really convenient or very comfortable. You know, for example, no air conditioning in a warehouse.
Starting point is 00:09:29 So here there is a major push for automation: automation of the whole logistics process, automation of the warehouse, you know, palletizing, filling up trucks, sorting, and all that. And what is required for these is vision. Machine vision is the capability that makes a robot as efficient as a human. Today, if you don't really have the right vision capability for a robot,
Starting point is 00:09:59 the robot is only 20 or 30% as efficient as a human, whereas you have to make the robot at least 90 or 95% as efficient as a human for it to really pay off. So that is one major trend where we see significant use for Trace and rapid growth. And the other is bringing manufacturing back to the U.S. We ourselves are doing a lot more manufacturing in the U.S., and we are seeing progressively larger numbers of people wanting to bring manufacturing to the U.S. But you can't really bring back the same manufacturing style and methodology that you are engaged in offshore, because of the cost structure differences we have here.
Starting point is 00:10:40 So you really need to allow for a lot more automation, and that automation requires machine vision of the highest quality. Trace addresses the top end of the market, where people have very stringent requirements. They want precision in the millimeter to micrometer range. They want a working range which is very flexible, anything from, say, 10 centimeters to 10 meters, so they don't have to worry about where they put the sensor or how big a field of view they want to look at, and yet they don't want to compromise on performance. And we also offer polarization information, which is very key for certain activities where you want to know, for example, what the objects are.
Starting point is 00:11:25 And on top of that, of course, we allow for velocity. So Trace is really addressing all those markets. We want to make a product that is completely eye-safe. Trace would be the only product on the market today that is completely eye-safe. All the other products require consideration of eye safety, and they also have problems with multi-user interference, because one sensor can blind another sensor. With our devices, there is no chance that a sensor would interfere with another sensor. The chances of that happening are the same as me winning the lottery tonight. So that's very unlikely. And the other thing is that our sensors work equally well indoors and outdoors, again,
Starting point is 00:12:09 so that you can actually use this for a very broad range of applications. Right. You know, semiconductor manufacturing is an example of that, because when I started, the fab was full of people, and now there are no people. And now we're bringing fabs back to the United States,
Starting point is 00:12:22 and, you know, automation is critical, right? Absolutely. You know, we're not trying to get rid of people. What we're trying to do is get jobs done that are not best done by people, or that people don't want to do, right? So, looking back a little bit, you had an investment from Honda, and I assume this has something to do with what you were just talking about. Can you give us an update on what's going on with this investment from Honda? Yeah, absolutely. We are very proud of the investment that Honda made in us, and we couldn't think of a more ideal partner or investor than Honda. As you know, Honda is one of the largest automotive manufacturers, but what really
Starting point is 00:13:06 makes Honda unique, in our opinion, is the fact that they do pretty much everything in mobility, right? Honda makes the largest number of mobility vehicles in the world, from cars to trucks to motorbikes, to jets and airplanes, to marine equipment, gardening equipment, and robots. So they are a perfect partner for us to get our technology into mobility applications. And you can imagine that a large, highly technology-oriented company such as Honda would not make an investment like this lightly. They would look into the technology, make sure that it is the right technology out of all the other competitors, make sure that it can service their needs into the long-term future, and then have a plan to use it. Unfortunately, I can't comment on the details here because it's highly confidential,
Starting point is 00:13:59 but I can tell you that we are extremely excited about this, and we think that it will play a key role in the evolution and growth of our technology. And hopefully, it will be a technology that benefits society in a big way, with help from Honda. Yeah, I agree. Honda is just about everywhere, and they're very big into robotics and automation. So that's a good affirmation of your technology. So I have one more technology-related question, and this is in regard to your LiDAR. What makes frequency-modulated continuous wave (FMCW) LiDAR optimal versus maybe radar or other types of LiDAR? That's a great question.
Starting point is 00:14:41 So if you go back a little bit and look at how radar evolved: if you go back to, say, 30 or 40 years ago, when a lot of these radar systems were being deployed, they also used the earlier technology, called time of flight. And there is nothing wrong with time of flight; there are just certain limitations. And over time, FMCW started to take over for radar applications. The major drivers for that were higher performance in terms of precision, but also immunity to multi-user interference. With time-of-flight radar or LiDAR technology, if two systems have direct line of sight with each other, or a reflection of one happens to shine into the other, it causes major interference. And that means that momentarily you go blind, or you confuse the other guy's signal for a reflection of your own and you get the wrong information, the wrong image.
Starting point is 00:15:42 So this is a major problem when a lot of cars or a lot of devices start to use your technology. When you have only one or two cars on the street using radar or LiDAR, maybe that's not a problem. But once every car on the street has a LiDAR inside, or a lot of equipment in your home has LiDAR inside, then this multi-user interference becomes a major problem. It became such a problem that legislation was introduced to actually ban the use of these time-of-flight systems, I think around 2003. And a lot of car companies actually had to switch, at great cost, from time-of-flight to
Starting point is 00:16:21 FMCW radar. Now, LiDAR has the same thing here. And I think one of the reasons why deployment of LiDAR in the automotive industry, especially in the West, in countries that have been making cars for a long time, has been a bit slow is because of this issue. People are waiting for FMCW to achieve the same level of maturity. Whereas some countries, like China, that didn't make automobiles 30 or 40 years ago are faster in deploying LiDAR, because they don't have the same concerns.
Starting point is 00:16:51 So the main differences between FMCW and time of flight are the precision, as well as the multi-user interference robustness of the technology. The chances that one of our LiDARs would interfere with another are about one in 10 billion, which is very, very rare. Even if we make them in our own factory, it's a very, very low probability. The other key benefit is that we are often able to achieve 10 to 100x better precision than a time-of-flight system can. We can achieve longer ranges with lower power levels, and our solutions typically end up being far more eye-safe than a time-of-flight system,
Starting point is 00:17:32 because we need lower powers to achieve it. And specifically, what is unique about our FMCW technology is our integration capability. We are able to integrate all the key optical functions necessary to make a very high-performance FMCW LiDAR, one that can achieve kilometers of range on one hand and micrometer levels of precision on the other. It is our unique integration platform that allows us to make very unique, high-performance optical components, all in a single silicon photonic chip, which we can manufacture using CMOS manufacturing steps very cost-effectively, and therefore offer a high-performance, cost-effective solution to our customers. Great explanation, thank you, Mehdi. A final question: what's next for SiLC? You know, what's on the horizon for 2025 and beyond?
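As a rough illustration of the FMCW principle Mehdi describes: a triangular frequency chirp produces a beat frequency between the outgoing and returning light that encodes range, while the Doppler shift encodes radial velocity. The sketch below uses textbook FMCW formulas; the chirp bandwidth, chirp time, and wavelength in the example are made-up assumptions, not SiLC specifications.

```python
# Textbook FMCW range/velocity recovery (illustrative, not SiLC specs).
# With a triangular chirp, the up- and down-ramp beat frequencies are
#   f_up   = f_range - f_doppler
#   f_down = f_range + f_doppler
# so range and radial velocity can be separated.

C = 299_792_458.0  # speed of light, m/s

def range_and_velocity(f_up_hz, f_down_hz, bandwidth_hz, chirp_time_s, wavelength_m):
    """Recover target range and radial velocity from the two beat frequencies."""
    f_range = (f_up_hz + f_down_hz) / 2        # range-induced beat frequency
    f_doppler = (f_down_hz - f_up_hz) / 2      # Doppler shift
    chirp_slope = bandwidth_hz / chirp_time_s  # Hz per second
    distance_m = C * f_range / (2 * chirp_slope)   # R = c * f_beat / (2 * slope)
    velocity_mps = wavelength_m * f_doppler / 2    # v = lambda * f_d / 2
    return distance_m, velocity_mps
```

For example, with an assumed 1 GHz chirp over 10 µs at a 1550 nm wavelength, a target 100 m away moving at 5 m/s yields beat frequencies near 60 and 73 MHz, from which the function recovers both range and velocity.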
Starting point is 00:18:27 Oh, great question. We are really excited about 2025. I think that this is the year where SiLC should be able to start to mass-manufacture the Trace product. We are really excited about it. We have a waiting list of customers waiting to receive the product. We are sampling existing development kits to them so that they know what to expect from Trace when it arrives. But we should be able to start shipping the actual products to them later this year. And we think it will be a very popular product. So we are in the process of setting up manufacturing,
Starting point is 00:19:02 high-volume manufacturing capabilities, and sorting out all our customers and interacting with them, getting them ready for the deployment of the product. That's really exciting. Also on the horizon is the product in collaboration with DZYNE for counter-drone applications. We are working very hard together with DZYNE to get design wins in terms of counter-UAS applications. We believe this year will be the year where we should get into many more government agencies, and also into commercial applications such as critical infrastructure, airports, and so on, together. And in terms of the other aspect of the work, at the financial level, SiLC is driving very hard to become cash-break-even as soon as possible. And I'm hoping that by the end of this year, we will get very close to cash break-even.
Starting point is 00:19:56 I think that this has been a major challenge for many LiDAR industry players. And at SiLC, we've been very careful about cost and have controlled it very carefully while investing in the key technologies. And I hope that by the end of this year, we will get very close to cash break-even. That's great. Thank you for your time, Mehdi. I really enjoyed the conversation. Hopefully, we can have you back later this year for an update.
Starting point is 00:20:21 I would love that. Thank you so much. I appreciate it and I enjoyed it too. That concludes our podcast. Thank you all for listening and have a great day.
