SemiWiki.com - Podcast EP302: How MathWorks Tools Are Used in Semiconductor and IP Design with Cristian Macario
Episode Date: August 8, 2025
Dan is joined by Cristian Macario, senior technical professional at MathWorks, where he leads global strategy for the semiconductor segment. With a background in electronics engineering and over 15 years of experience spanning semiconductor design, verification, and strategic marketing, Cristian bridges engineering and business.
Transcript
Hello, my name is Daniel Nenni, founder of SemiWiki, the open forum for semiconductor professionals.
Welcome to the Semiconductor Insiders podcast series.
My guest today is Cristian Macario, senior technical professional at MathWorks, where he leads the global strategy for the semiconductor segment. With a background in electronics engineering and over 15 years of experience spanning semiconductor design, verification, and strategic marketing, Cristian bridges engineering and business to help customers innovate using MathWorks tools. Welcome to the podcast, Cristian.
Thank you, Daniel. It's great to be here.
So, Cristian, many semiconductor engineers first encountered MathWorks during their university
studies. In what ways are simulation and modeling tools, such as those from MathWorks, relevant to
their work in the semiconductor industry today?
Well, we see modeling and data analysis tools such as MATLAB and Simulink being very useful when it comes to defining the architecture of semiconductor IPs, especially mixed-signal IPs. You might have heard of Simulink; it's a tool for modeling and simulation that allows you, for instance, to model both continuous-time and discrete-time components. That means you can easily model analog and digital together at the very early phases of the design flow, before having a netlist. And this helps validate your architecture before committing to silicon.
Another area where data analysis tools like MATLAB can help is in analyzing circuit-level simulations and lab measurements.
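The idea described here, simulating continuous-time analog behavior alongside discrete-time digital processing before any netlist exists, can be sketched in a few lines. The following is an illustrative Python approximation (using SciPy in place of MATLAB/Simulink; all component values are assumptions, not from the episode): a continuous-time RC low-pass stands in for the analog front end, and an ideal quantizer stands in for the digital side.

```python
import numpy as np
from scipy import signal

fs = 1e6    # digital sample rate (assumed)
fc = 50e3   # analog filter cutoff (assumed)

# Continuous-time first-order RC low-pass: H(s) = wc / (s + wc),
# discretized so the analog and digital paths share one time base
wc = 2 * np.pi * fc
num_d, den_d, _ = signal.cont2discrete(([wc], [1, wc]), dt=1/fs, method="bilinear")

def quantize(x, bits=8, full_scale=1.0):
    """Ideal ADC quantizer: the 'digital' half of the mixed-signal chain."""
    step = 2 * full_scale / 2**bits
    return np.clip(np.round(x / step) * step, -full_scale, full_scale - step)

# Stimulus: an in-band tone plus an out-of-band interferer
t = np.arange(0, 1e-3, 1/fs)
x = 0.5 * np.sin(2*np.pi*10e3*t) + 0.5 * np.sin(2*np.pi*400e3*t)

filtered = signal.lfilter(np.ravel(num_d), np.ravel(den_d), x)  # "analog" path
digital_out = quantize(filtered)                                 # "digital" path
```

With a model like this, one can already check architectural questions, such as whether the chosen filter order suppresses the interferer enough before quantization, long before a netlist exists.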
Right. And what specific industry trends are driving the shift towards early behavioral modeling?
Is this adoption being led by certain segments such as AI, 5G, or automotive?
Well, we see a rising complexity of semiconductor devices, and nowadays engineers often need to optimize the interaction between analog and digital. From my point of view, this is driven mainly by two trends: one is AI and AI data centers, and the other is the rising integration between the analog world, the real world, and the digital world.
When it comes to AI and AI data centers, data centers nowadays need to move a large amount of data at very high speed. For this, they need very efficient high-speed interconnects, which are typically SerDes-based, and these typically incorporate mechanisms like adaptive equalization, which is mainly about compensating some analog effects with digital algorithms. Designing such systems requires finding the right mixed-signal architecture and choosing the right equalization schemes and digital algorithms. And for this, architectural and behavioral modeling is key.
The other trend I mentioned is the integration between the digital world and the real world,
which is actually analog.
Well, we see these everywhere, right?
In cars with automated driving, in smart homes, in IoT devices, and so on. All these systems need devices such as ADCs and sensors. And these kinds of devices also need tight integration between analog and digital. You can think of an accelerometer mounted on a car that is driving on a bumpy road: you need quite sophisticated signal processing algorithms to extract valuable information from that sensor. So again, architectural modeling is key here to finding the right mixed-signal architecture and the right compensation and equalization algorithms.
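The accelerometer example can be made concrete with a toy model. This is a hypothetical Python sketch, not an actual automotive pipeline: a slow tilt signal buried in broadband "bumpy road" noise is recovered by a simple digital low-pass (here just a moving average; a real design would use a properly designed filter).

```python
import numpy as np

fs = 1000.0                                   # sample rate in Hz (assumed)
t = np.arange(0, 5, 1/fs)

tilt = 0.2 * np.sin(2*np.pi*0.5*t)            # slow signal of interest (0.5 Hz)
rng = np.random.default_rng(0)
raw = tilt + 0.5 * rng.standard_normal(t.size)  # broadband road vibration

# Moving-average low-pass (~0.25 s window) as the digital compensation stage
win = int(0.25 * fs)
kernel = np.ones(win) / win
recovered = np.convolve(raw, kernel, mode="same")

rms_err_raw = np.sqrt(np.mean((raw - tilt)**2))        # error before filtering
rms_err_rec = np.sqrt(np.mean((recovered - tilt)**2))  # error after filtering
```

Even this crude digital stage cuts the error substantially, which is exactly the analog-plus-digital trade-off an architectural model lets you explore early.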
And how is the push for high-speed data transmission influencing the shift to this modeling approach?
Well, you know, as data rates increase, engineers need to go beyond a circuit-level approach. They typically need to incorporate techniques from wireless communication, like advanced adaptive equalization, which is again about using a digital algorithm to compensate for channel effects that change over time. And this needs tight interaction between analog and digital. Architectural modeling allows you to choose the right topologies and identify potential issues up front. And if you think about it, MATLAB and Simulink originally come from the control theory and signal processing domains, and this makes them good tools for the analysis and simulation of systems that include feedback loops and signal processing algorithms.
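The adaptive equalization discussed here is commonly an LMS-style feedback loop. Below is a minimal, generic Python sketch of LMS equalization over an assumed three-tap ISI channel; it illustrates the principle only and is not any product's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
symbols = rng.choice([-1.0, 1.0], size=5000)          # training symbols

channel = np.array([1.0, 0.4, 0.2])                   # assumed ISI channel
received = np.convolve(symbols, channel)[: symbols.size]
received += 0.01 * rng.standard_normal(symbols.size)  # small noise

n_taps, mu = 7, 0.01                                  # equalizer length, step size
w = np.zeros(n_taps)                                  # adaptive tap weights
buf = np.zeros(n_taps)                                # sliding input buffer

errors = []
for k in range(symbols.size):
    buf = np.roll(buf, 1)
    buf[0] = received[k]
    y = w @ buf                  # equalizer output
    e = symbols[k] - y           # error against the known training symbol
    w += mu * e * buf            # LMS update: track the time-varying channel
    errors.append(e * e)

early = np.mean(errors[:200])    # mean squared error before convergence
late = np.mean(errors[-200:])    # mean squared error after convergence
```

The squared error collapses as the taps converge, which is the "digital algorithm compensating analog channel effects" behavior that architectural models are used to size and verify.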
And how does early behavioral modeling address the complexity of mixed-signal interactions in advanced semiconductor designs?
Well, it basically allows you to look at the design from a system-level perspective and analyze and simulate analog and digital paths together since the very early design phases, so before you have a netlist for the analog paths and RTL for the digital components. Now, tools like Simulink allow you to model continuous-time components and discrete-time components, so they are well suited for this purpose. What we also see recently is that engineers also model the environment around the semiconductor IP to validate the architecture. What I mean is that, for instance, for high-speed data links you want to model the transmitter and receiver as well as the channel, and for a sensor you want to model the environment that's going to be around the sensor, to be able to generate some realistic scenarios, some realistic stimuli, for the architecture and validate it deeply.
So, for instance, we recently published a technical article with NXP Semiconductors describing how they have been modeling realistic driving scenarios to validate the architecture of their radar transceiver.
All these approaches allow you to validate the architecture before committing to silicon and minimize the risk of architectural issues.
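As a toy illustration of "modeling the environment around the IP," the Python sketch below builds a radar-like scenario: a delayed echo of an FMCW chirp serves as a realistic stimulus, and dechirping recovers the expected beat frequency. All parameters are invented for illustration and are not taken from the NXP article.

```python
import numpy as np

fs = 2e6                          # sample rate (assumed)
T = 1e-3                          # chirp duration (assumed)
B = 200e3                         # chirp bandwidth (assumed)
t = np.arange(0, T, 1/fs)

chirp = np.exp(1j * np.pi * (B / T) * t**2)   # baseband FMCW ramp

# Environment model: one target returns a delayed, attenuated echo plus noise
delay_s = 20e-6
n_delay = int(delay_s * fs)
echo = np.zeros_like(chirp)
echo[n_delay:] = 0.3 * chirp[: chirp.size - n_delay]
rng = np.random.default_rng(2)
echo += 0.01 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

# Dechirp: mixing with the conjugate ramp turns the delay into a beat tone
# at f_beat = (B/T) * delay
beat = echo * np.conj(chirp)
spec = np.abs(np.fft.fft(beat))
f = np.fft.fftfreq(t.size, 1/fs)
f_peak = abs(f[np.argmax(spec)])
expected = (B / T) * delay_s
```

Feeding such scenario-driven stimuli into a transceiver architecture model exercises it far more realistically than synthetic tones alone.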
Okay, so how do engineers ensure that the behavioral models are sufficiently accurate and
representative of the final silicon? That's a great question. We can think of three modeling phases,
each of them having different needs for accuracy. The architectural phase, the pre-silicon phase,
and the post-silicon phase. In the architectural phase, that's where the mixed-signal architecture is being defined. So typically engineers start modeling the different components in an ideal way and then, step by step, include non-ideal behaviors. These non-ideal behaviors typically come from previous projects, from experience, or from some data about the fabrication process. That's what you can do at this stage, right, since you don't have a netlist yet, at least in most cases. And the outcome of this phase is the design specification for all the various components of your system.
During the pre-silicon phase, well, that's the phase where the different components are being implemented, so the analog components in terms of netlists and the digital components in terms of RTL code. In this phase,
you typically want to align your architectural models with the implementation. So for instance, if you have AC curves for some analog components, you can do some data analysis to extract poles and zeros and ensure that your architectural models have the right poles and zeros in place, reflecting the real implementation of these components. If you have nonlinearities, you can make a polynomial fit for these nonlinearities and feed it to the architectural model, again to ensure the alignment between the architectural description of your system and the netlist. And this allows you to refine your architecture if needed. If you're lucky, you might even be able to relax some specifications and some requirements. In other cases, you might figure out that you need some more advanced digital compensation algorithms to compensate for some analog effects that you were not expecting.
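The polynomial-fit step mentioned here can be sketched directly. In this hypothetical Python example, a synthetic amplifier transfer curve with cubic compression stands in for netlist or lab data, and numpy.polyfit extracts coefficients to feed back into the behavioral model.

```python
import numpy as np

vin = np.linspace(-1, 1, 201)
# Stand-in for simulated/measured data: gain of 2 with cubic compression,
# plus a little measurement noise (all values are illustrative)
rng = np.random.default_rng(3)
vout_measured = 2.0 * vin - 0.3 * vin**3 + 0.002 * rng.standard_normal(vin.size)

# Fit a 3rd-order polynomial to capture the nonlinearity
coeffs = np.polyfit(vin, vout_measured, deg=3)

def behavioral_amp(x):
    """Architectural model updated with the extracted nonlinearity."""
    return np.polyval(coeffs, x)

residual = np.max(np.abs(behavioral_amp(vin) - vout_measured))
```

The fitted coefficients recover the underlying gain and compression terms, so the architectural model now tracks the implementation rather than an idealized component.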
Finally, in the post-silicon phase, you want to ensure the correlation between your architectural models and your silicon. So what you do is take measurements from the lab, from your characterization, and ensure that your architectural models correctly represent the silicon. And after you do that, what engineering teams, or semiconductor companies, typically do is exchange these models with their customers as a form of executable specification. We see mainly two formats for these models. For SerDes systems, what's typically exchanged is what's called an IBIS-AMI model, which is a standard for describing this kind of device. And for other kinds of components, like ADCs, sensors, etc., what we see is Simulink models being exchanged. For instance, if you go to the Analog Devices website, you can see a big catalog of Simulink models for ADCs, sensors, and many of their other devices.
And what are the potential risks if early behavioral modeling is skipped or poorly executed?
I mean, how do companies mitigate those risks during the design process?
Well, if you don't do architectural modeling well, especially nowadays, the risk might be a suboptimal design: you might end up over-designing some parts of your system, or having some incompatibilities, so some functional issues, maybe in some corner cases. Or you might run into delays, typically due to architectural issues found late in the design flow, and having to go back to the architecture, refine it, and update your design. And even silicon respins could be caused by poorly executed architectural modeling.
How early in the design cycle can engineers effectively use behavioral modeling to identify potential issues with mixed-signal interactions or performance bottlenecks?
Well, basically as soon as requirements are available. For instance, if you use Simulink as a modeling platform, Simulink offers a big catalog of building blocks for analog components, like filters, charge pumps, prescalers, and so on, and also for digital components, like digital filters, FFTs, and many commonly used DSP (digital signal processing) algorithms. This allows engineers to quickly build their architecture and start their design space exploration, and then over time they can add non-ideal behaviors, like the nonlinearities we discussed previously, noise, and so on. Moreover, for commonly designed systems such as analog-to-digital converters, PLLs, and SerDes IPs for various SerDes standards, reference designs are available to help engineers jumpstart their design and architecture exploration.
Final question, Cristian: how do MathWorks tools handle the transition from early behavioral models to the final transistor-level design?
You know, what does the path look like from simulation to hardware?
That's a great question. Well, if you develop architectural models using Simulink, they can be reused within EDA simulators, both analog and mixed-signal simulators and digital simulators, RTL simulators basically. And this is very helpful. What engineers typically do is generate verification components from Simulink to be integrated within analog and digital simulators. For instance, you can take a charge pump model from your architectural description and export it for analog simulators, so that it can be used as a reference model, or you can export this charge pump model as a model to be integrated into digital simulations, and in this case it's a good way to bring analog models within digital simulation and do digital mixed-signal simulations. Now, there are mainly two techniques that we offer for this purpose. One, as I was anticipating, is to generate or export SystemVerilog components or SystemVerilog classes from architectural models. These components can then be integrated into analog or digital simulators as behavioral models, basically.
And the other technique is what we call co-simulation, where part of your system gets simulated within Simulink, so using architectural models, and the rest of it runs within a mixed-signal simulator or an RTL simulator, so using the real netlist or the corresponding RTL. All these techniques allow you to avoid duplication of effort: you develop your models once, in the very early stages of your design flow, and you reuse them throughout the design flow. And what's interesting is that this also enables better cross-team collaboration, because analog teams and digital teams are synchronized and are reusing the same architectural models.
Great. It's a pleasure meeting you, Cristian. Great conversation, and I hope to have you on again sometime.
Great. Thank you very much. It was great to be here.
That concludes our podcast. Thank you all for listening and have a great day.
Thank you.