Embedded - 202: Flush and Your Inner Fish
Episode Date: May 31, 2017. Professor Alex Dean spoke with us about his ARM embedded systems books and @NCState courses. Alex's page is in North Carolina State University's department of Electrical and Computer Engineering. His... book is Embedded Systems Fundamentals with Arm Cortex-M Based Microcontrollers: A Practical Approach (ecopy available from the ARM Media site). It uses the FRDM-KL25Z as the example board throughout the text. Alex also co-authored Embedded Systems, An Introduction Using the Renesas RX62N. His favorite RTOS is Keil RTX. We also mentioned Your Inner Fish: A Journey into the 3.5-Billion-Year History of the Human Body by Neil Shubin and Flush by Carl Hiaasen.
Transcript
Welcome to Embedded.
I'm Alicia White, here with Christopher White.
I'm happy to have Professor Alex Dean on the show this week to talk about his career,
his classes at North Carolina State University, and his new book titled
Embedded Systems Fundamentals with Arm Cortex-M Based Microcontrollers,
A Practical Approach.
Good job.
So many things to talk about, but I do have a few notes about our show first.
Digilent sent me a coupon worth 15% off the analog and digital discoveries.
It is embeddedfm15, all in lowercase. It's good for the month of June.
It's not as big a discount as academic pricing, so use your student ID cards
to get a better price, a really good price. And surveys: don't forget to fill out the surveys.
They're going to be done by June 2nd, and then we'll know more about you and what you like about us.
There'll be a link in the show notes. Alex, thanks for joining us today. My pleasure. Could you tell
us a little bit about yourself? Sure. So I am an associate professor in the electrical and computer
engineering department at North Carolina State University in Raleigh. I've been there since 2000. I teach and do research in the area
of embedded systems. And I've worked in industry, actually as a break during my grad school
experience, and also as a consultant starting in 2000 doing design reviews for embedded software
for industrial control applications and consumer electronics.
And I've written a few books.
I've developed course materials for Renesas and ARM and imagination technologies and microchip,
all on embedded systems.
And the textbooks I wrote were for Renesas and now ARM.
Excellent.
So I believe you are familiar with lightning round, where we ask you short questions and want short answers.
And if we are behaving ourselves, we won't ask you for lots of detail afterwards.
I like the idea of brevity.
Okay, Chris, do you want to go first?
No, I don't. You always make me go first. Now I'm going to make you go first. Okay, favorite movie or book or work of fiction that you encountered for the first time in the last year?
That fiction thing really trips me up because I really enjoyed a nonfiction book called Your Inner Fish, which talks about evolution.
I haven't read a whole lot of fiction in the past year, but I'd say it would have to be Carl Hiaasen's latest book, which was, oh, I can't remember.
Anyhow, fun stuff, set in Florida.
All right.
Well, usually, yes.
Flush.
That's the one it was.
Flush.
It was funny.
Flush and Your Inner Fish together, both.
Those, yeah. Preferred voltage?
2.7.
All right.
It's a little nontraditional, but pretty, yeah.
All right.
I'd like to go lower, but LEDs, displays, not available.
Do you believe in starting many projects and maybe not finishing them, or
focusing on one project and getting to the finish line on that before starting another?
I, by necessity, have to work on multiple projects simultaneously, and I've tried to reduce the
number that I work on down to a manageable number, say three or five or seven.
Favorite animal?
Labrador retriever dog.
Favorite RTOS?
That one's tricky.
Probably RTX, but it's changing.
There are several I've worked with, and yeah, call it RTX.
Tip or trick you think everyone should know?
Look at what's really happening to figure out how to fix it.
All right.
I'm pretty good at fooling myself, so yes, that's good advice.
I know this is breaking this way.
I'm thinking about optimization, debugging, all that.
Yeah.
I know that you are very familiar with Arduinos,
so I have a two-part question.
What do you like least about Arduinos and what do you like most about them?
Arduinos have an advantage of bringing a lot of non-traditional people into the embedded system development field.
That's the plus. The minus is that the design environment places a lot of implicit
constraints on what you do and how you do it. And that makes the people think in a certain way and
closes their eyes to other options, other opportunities for how things are actually done in industry.
That's a good answer.
I agree with the implicit part.
And they aren't always good constraints.
I mean, they aren't always even real.
It's just how it makes you, it forces you down a path and puts blinders on you
so that you can't see things like low power and optimization or even debugging properly.
Multi-threading is my own personal bugbear about Arduinos, but we all have our favorites. So you teach at a university and you teach embedded systems.
And this is such a new concept for me that someone teaches this.
In my day, when we went to school, it was CS or engineering.
And there wasn't a lot of actual instruction on what I ended up doing for a career.
Is this a new field or are you pioneering it?
Back when I started in 2000, embedded systems was the big and hot topic and lots of people
were excited about it and they were trying to define what it was. And I was lucky enough to
graduate and get on the market right then.
So I got my undergrad in 91. I got my master's in 94 and PhD in 2000. But I was timed right to
be able to go into the field that I love, which is embedded systems. Embedded systems
classes in universities tend to be a bit, wait a minute, are we still in lightning round?
No, no, no, sorry.
Okay, okay, good. My watchdog timer went off there.
Embedded systems, it's a challenge.
There are a lot of exciting things you can do with microcontrollers, and a lot of classes focus on those applications rather than looking at the infrastructure of the embedded system itself: how you design embedded systems, and what are good design approaches,
what are bad design approaches for a given scenario.
So I think there are a lot of people who do embedded systems work,
embedded systems teaching, and classes on how to use microcontrollers for various things.
However, there's less pure embedded systems research, if you want to call it that.
And that comes from a few different reasons, but that's my high-level answer.
Okay, what kind of research are you working on?
Embedded systems. So I've been focusing on infrastructure
for embedded systems, meaning developing new ways of doing things that can be applied to a wide
range of applications. So rather than developing a specific microcontroller design for a certain
application,
I've been looking at other things.
I started off with something called software thread integration,
which is a compiler method that I came up with that takes code from two different functions
and merges them together to make one implicitly multi-threaded function
that gives you basically zero-cost context switching.
So you can have one software thread that does multiple things, say some might be time-critical,
but there's no performance overhead for that.
So that's something I started off with.
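To make that concrete, here is a toy, hand-written sketch of the idea, not Alex's actual software thread integration tooling; set_pin is a made-up board-support call, and a real STI compiler would place the interleaved instructions so the timing still holds.

```c
#include <stdint.h>

/* Hypothetical board-support call (not from the research tools). */
extern void set_pin(int level);

/* Hand-integrated function: the "host" thread bit-bangs a pin with strict
 * timing, and slices of the "guest" thread (a checksum) are interleaved
 * between the edges, so both make progress with zero context-switch cost. */
uint32_t send_bits_and_checksum(uint32_t bits, int nbits,
                                const uint8_t *buf, uint32_t len)
{
    uint32_t sum = 0;
    uint32_t i = 0;

    for (int b = 0; b < nbits; b++) {
        set_pin((int)((bits >> b) & 1u));   /* time-critical edge (host) */
        if (i < len) {
            sum += buf[i++];                /* one slice of guest work */
        }
        /* In real STI, padding would be inserted here so the next edge
         * still lands exactly on its deadline. */
    }
    while (i < len) {                       /* finish leftover guest work */
        sum += buf[i++];
    }
    return sum;
}
```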
Since then, I moved into real-time systems, real-time kernels, scheduling, predictable memory systems.
So the problem with caches is that usually they hit, sometimes they miss.
So sometimes your access takes one cycle, sometimes it takes 20 cycles.
If you're building a real-time system that has to complete its work within a certain amount of time,
you have to design for the worst case, which presumably would be all memory accesses are misses.
And yeah, and that's a real problem.
So there are plenty of people who have worked on cache analysis to characterize accesses as hits or misses. Instead, I looked into using scratchpad memory, which is basically a fast area,
well, an area of RAM that's fast. So on-chip RAM. And the compiler can see it, and the compiler takes advantage of it to put frequently used data there. So my students and I developed methods to
identify frequently used data, so hot data, and move it into the scratchpad memory or locate it in the scratchpad memory.
This also included splitting up a stack frame, putting the hot variables in the hot stack and cold variables in the cold stack.
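A minimal sketch of what that placement can look like, assuming a GCC-style toolchain and a linker script that maps a .scratchpad section to the fast on-chip RAM; the section name and variables are invented for illustration, not taken from the actual research tools.

```c
#include <stdint.h>

/* Hot data: accessed in the inner loop, so place it in fast on-chip
 * scratchpad RAM.  The ".scratchpad" section name is an assumption; it
 * would have to exist in the linker script and map to the fast RAM. */
static uint16_t filter_state[32] __attribute__((section(".scratchpad")));

/* Cold data: touched rarely, so it can live in normal (slower) memory. */
uint32_t error_log[256];

uint16_t filter_sample(uint16_t x) {
    uint32_t acc = 0;
    /* Every access here hits the scratchpad: fast and predictable. */
    for (int i = 31; i > 0; i--) {
        filter_state[i] = filter_state[i - 1];
        acc += filter_state[i];
    }
    filter_state[0] = x;
    acc += x;
    return (uint16_t)(acc / 32);
}
```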
So we developed tools to analyze and modify code to do that. And then step three, the current research phase is something
wild and crazy, which is looking at lowering the cost of implementing switch mode power converters.
These are like the buck converters or boost converters, things that you find in power bricks, power supplies.
They're very efficient at converting voltages from one level to another.
And they typically have two sections.
One is a controller and the other is a power stage, which has power transistors and things like that.
So my work involves taking the controller and putting that
into software onto the microcontroller that's already in your system and using real-time
system design and analysis methods to make sure that that control loop runs when it needs to run
to ensure that that output voltage is regulated. And that gives you something that's really cheap
because you've gotten rid of a whole bunch of hardware. And it's running on the software instead.
And this makes it easy to put in multiple voltage converters.
So you might have multiple voltage domains.
You might use this to drive motors more efficiently, drive high-brightness LEDs at a constant current,
charge batteries with constant current or constant voltage profiles.
So there are a lot of applications there.
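As a rough illustration of that idea, not Alex's actual controller: a PWM-period interrupt samples the output voltage and nudges the duty cycle. The adc_read_vout and pwm_set_duty calls, the handler name, and the constants are all placeholders.

```c
#include <stdint.h>

/* Placeholders for the microcontroller's ADC and PWM drivers (assumptions,
 * not a real vendor API). */
extern uint16_t adc_read_vout(void);          /* raw output-voltage sample */
extern void     pwm_set_duty(uint16_t duty);  /* 0..PWM_MAX                */

#define PWM_MAX      1000u
#define VOUT_TARGET  1860u   /* example target in raw ADC counts */

static uint16_t duty = PWM_MAX / 2;

/* Runs once per PWM period, triggered by the PWM timer interrupt, so the
 * control loop has a fixed, analyzable period.  A simple integral-only
 * regulator: raise the duty if the output is low, lower it if it's high. */
void PWM_PERIOD_IRQHandler(void) {
    int32_t error = (int32_t)VOUT_TARGET - (int32_t)adc_read_vout();
    int32_t next  = (int32_t)duty + (error >> 4);    /* small integral gain */

    if (next < 0)                 next = 0;
    if (next > (int32_t)PWM_MAX)  next = PWM_MAX;

    duty = (uint16_t)next;
    pwm_set_duty(duty);
}
```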
Yeah, there really are.
I'm just going through my, I'm thinking about all the, but I, wow.
Yes.
Okay.
Well, I'm like, okay, so for a motor, often we PWM it, but then that has the disadvantage of being a square wave signal, which can be messy and noisy.
But if I could modify the voltage so that I could just say on or off, and actually have a lower voltage to go slower or whatever I needed, why would I ever need the PWM?
I could see how that would be very useful.
As well as all of the problems with starting the power subsystems
in the right order,
which seems to be something that
is just painful sometimes.
The data sheets often say things like that.
It's like, you have to sequence them this way.
Otherwise, there could be irreparable damage. Yes. That's always the irreparable damage that happens on the very
first prototype. So you have been teaching for quite a while. And do you think of embedded
systems more as software or hardware? Do you see more students coming into it with a software perspective or an electrical engineering perspective?
Well, I am in the electrical and computer engineering department.
So the majority of the students in my classes are ECE majors, but we do get plenty of computer science students, some applied math or applied physics, I guess.
But by and large, it's mostly ECE students.
And they tend to, there are some that are more hardware
and some that are more software.
It depends on what background they have.
Our undergraduate students are probably going to be,
I guess, a more even balance between hardware and software.
Our international students, you know, actually it varies. Some of them are doing embedded systems
after doing ASIC design. So they have a, you know, they're thinking about Verilog
while other students come at it from more of a software point of view.
So I can't really give a good answer.
There are a lot of different data points out there.
Just like the field, it's kind of a mix of things.
I found with writing my book that I spent a lot of time thinking, well, somebody who has an electrical engineering background is going to think that's obvious.
Or someone with a software background is going to think that that's completely, of course everybody knows that.
And I did want to essentially make it okay with myself that they were going to understand half of it and they were going to learn half of it.
So the book was only half useful.
That's a good way to look at it.
It helped me get over my hurdles of how am I going to explain this to somebody who already might know part of it.
With your book, you go through it with a very hands-on approach.
Can you give us an introduction to what your book is and what you like best about it?
Sure.
So the book is an introduction to creating embedded systems
or designing embedded systems.
And the target audience is an undergraduate student
that already has experience in C programming
and has preferably had a digital logic class
and basics of circuit theory
so they can understand how to read a schematic
and look at diodes and currents and things like that.
So do you see that these readers are going to be your students?
I mean, is this a class lecture series? So yes, the textbook is targeting the students in my
classes. I'm actually not teaching the introduction to embedded systems class at our university because we've got a great adjunct who does his version of
the course. But the challenge with microcontrollers is that the peripherals are different across
different manufacturers and different families. So the students in our university, they start off learning MSP430 in the Introduction to Embedded Systems class.
But then in the next embedded classes, the advanced classes, we're using a Cortex-M based microcontroller and the peripherals are completely different.
They're not TI, they're NXP, formerly Freescale. So the textbook provides a great way
for them to get up to speed and understand better how those peripherals work. The other big part,
really, with the book, though, is the discussion of concurrency and how do you share the processor's time among multiple pieces of software.
And that's really a unique aspect of the book that I really like.
Okay, so there's interrupts.
I mean, that's one way to give time to certain events.
And there's the main loop, if we want to go that way,
in which case we're just handling events as they come in through interrupts.
But there's also schedulers or system ticks that kick off certain things at certain times.
Is that what you mean by concurrency, or is there something more?
Well, it can be split into two categories: work that gets done by software and work that gets done by hardware.
And for embedded systems, both of those are important because the microcontrollers are different from microprocessors just because of, well, mainly because of the vast array of
peripherals that they have that can work autonomously. So these are essentially digital
logic circuits, mostly digital logic circuits, that can go and do work
and take it off the shoulders of the CPU so the software doesn't have to be nearly as complex.
But this is like my SPI driver, or my SPI component of my hardware, so that I don't have to toggle the GPIO line at some clock frequency.
Right. So that's an example.
And there are plenty of examples.
Some of the things peripherals do just make it easier for the CPU.
Other things they do are impossible for the CPU to do on its own.
And other peripherals do things so that you can build a complex system and compose it from multiple components and not have a massive challenge of trying to make sure everything is happening at the right time.
So like DMA, where it happens in the background, and if I'm driving a motor right now, I don't have to worry about a transfer happening because I literally could not do both at the same time.
Right, right.
And there are so many beautiful examples related with digital power conversion showing how incredibly valuable the peripherals are.
So the digital control, well, the digital power conversion circuits are driven by a pulse width modulated signal.
And you've got to measure, say, the output voltage.
Well, that output voltage is going to vary with the PWM signal.
So if your timing isn't right, sometimes you'll be sampling it at the high point.
Sometimes you'll be sampling it halfway down.
Sometimes you'll be sampling it a little farther down.
So you'll get a whole bunch of noise. So what you need is to synchronize your sampling by the A-to-D converter with the PWM signal. So the peripherals typically have an
interconnection method that allows that synchronization to be done automatically in hardware, and because of that, the software
doesn't have to be nearly as precise in its timing in order for the system to work properly.
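A sketch of the difference, using invented pseudo-HAL names rather than a specific vendor API; on the KL25Z this kind of routing is configured through the SIM and ADC registers.

```c
#include <stdint.h>

/* Placeholder driver functions and trigger ID (assumptions, not a real
 * vendor API). */
extern uint16_t adc_start_and_wait(void);
extern void     adc_select_hw_trigger(int trigger_source);
extern void     adc_enable_irq(void);
#define ADC_TRIGGER_PWM_PERIOD  1

/* Naive approach: software decides when to sample, so each sample lands at
 * a different point of the switching ripple and the readings look noisy. */
uint16_t sample_vout_polled(void) {
    return adc_start_and_wait();
}

/* Better: route the PWM timer's period event to the ADC hardware trigger,
 * so every conversion starts at the same phase of the switching cycle and
 * the software's timing no longer has to be precise. */
void configure_synchronized_sampling(void) {
    adc_select_hw_trigger(ADC_TRIGGER_PWM_PERIOD);
    adc_enable_irq();      /* result handled in the ADC ISR */
}
```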
Yeah and you don't have to do a bunch of filtering to try to get rid of this noise that
was caused just because you didn't link the peripherals together.
Right and you know another advantage of not having to do the filtering is the system is much
more responsive because the filter introduces delay.
Right.
RAM and delay and code. The filter is pointless if you can avoid it.
If you can't, then it's all very important.
Yeah.
Yeah.
So that's the hardware side.
The software side, it's a wide open field. That's sort of the double-edged sword of software development. You can do anything with it, right? Some approaches work better than others. Some of them scale up to big programs better than others do. And the scheduler is really
a key concept for how you build the software. How are you going to allocate the processor's time to
these different pieces of the program? So in the book, in chapter three, I get into concurrency and how we get that in an embedded system.
And I start off with a simple program that reads switches and uses them to determine how to flash LEDs.
So an RGB LED, are we flashing red, green, blue, or flashing white and black, on and
off?
And the switches control the switching pattern or the flashing pattern.
And I step through a few. I start with the basic program where it's all in a while(1) loop. And
then I break that software into tasks so that the switch reading is separate from the LED flashing.
Then we look at scheduling the tasks cooperatively so that first we read the switches, then we flash the LEDs, then we read the switches again and we go back.
And we see that there's some time delay there between when you press the switch and when the LED flashing pattern changes.
And that's because the two tasks are running cooperatively.
So one doesn't preempt the other.
The task that reads the switches doesn't actually see that the switch has been pressed until that code runs.
So there can be a delay there.
So then we look at how to improve the responsiveness of these tasks by basically shortening the tasks.
So breaking them up into state machines so that you can say, well, you know, this task is only going to take
at most two milliseconds every time I call it. I might have to call it several times to step
through all the work, but the work will get done, but it won't delay other things excessively.
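A compressed sketch of that stage, with invented helper names rather than the book's exact code: the switch task is short by nature, and the LED task is a small state machine that returns quickly if it isn't time to do anything yet.

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative board-support calls (not the book's code). */
extern bool     read_switch(void);
extern void     set_rgb_led(uint8_t r, uint8_t g, uint8_t b);
extern uint32_t millis(void);          /* hypothetical millisecond tick */

static volatile bool fast_flash = false;

/* Task 1: short and bounded -- sample the switch and return. */
void task_read_switch(void) {
    fast_flash = read_switch();
}

/* Task 2 as a state machine: each call does one quick step and returns,
 * so it never holds the processor for a whole flash period. */
void task_flash_led(void) {
    static enum { LED_ON, LED_OFF } state = LED_ON;
    static uint32_t last_toggle = 0;
    uint32_t period = fast_flash ? 100u : 500u;     /* milliseconds */

    if (millis() - last_toggle < period) {
        return;                                     /* not time yet */
    }
    last_toggle = millis();

    if (state == LED_ON) { set_rgb_led(0, 0, 0);   state = LED_OFF; }
    else                 { set_rgb_led(255, 0, 0); state = LED_ON;  }
}

/* Cooperative scheduler: run each task to completion, round robin. */
int main(void) {
    for (;;) {
        task_read_switch();
        task_flash_led();
    }
}
```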
And then after that, we look into using interrupts and event-driven software so that when we press the switch,
an interrupt is generated and the ISR runs,
and that ISR is going to set a global variable that says to the task,
hey, I want you to start flashing the LEDs quickly.
Eventually, when the LED flashing code starts running again,
it'll check that global variable and say,
oh, I should flash things quickly.
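A sketch of that event-driven step, with a placeholder interrupt handler name: the ISR only records the event in a volatile flag, and the task reacts the next time it runs.

```c
#include <stdbool.h>

/* Shared flag between the ISR and the task.  volatile tells the compiler it
 * can change outside normal program flow, so it is re-read every time. */
static volatile bool flash_fast_requested = false;

/* Placeholder name for the switch's port-interrupt handler. */
void SWITCH_IRQHandler(void) {
    /* ...clear the pin's interrupt flag here (device-specific)... */
    flash_fast_requested = true;       /* just record the event and return */
}

/* Called by the scheduler loop; it reacts the next time it gets to run. */
void task_flash_led(void) {
    if (flash_fast_requested) {
        flash_fast_requested = false;
        /* ...switch to the fast flashing pattern... */
    }
    /* ...carry on with the normal flashing state machine... */
}
```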
So that cuts down on the response time significantly. We also look at how to
take the next step and prioritize the tasks. Instead of doing them round robin,
A, B, A, B, A, B, we look at saying, okay, I want task A to have higher priority than task B, and the scheduler then will run task A if it's ready.
If it's not, then it'll run B if that's ready.
And then finally, we introduce the concepts of preemption, so tasks can preempt each other. And that really sets the stage for students to understand how to structure their
software so that it's made of these separate pieces that are as independent as can be.
And that makes them much more maintainable and developable, I guess, composable. But it also
then gives them a foundation for learning about a real-time
operating system where you start with this understanding of independent tasks, and then
you add concepts of how do they communicate with each other, how do they synchronize with each
other, and that's a different course. Now you've got signals and semaphores.
Yes. You've walked through all of this and it
sounds like you're nearly teaching them how to build an operating system. Is that something they
need to know? Well, no. Most of them do not need to know how to build an operating system.
I mean, because we're long past the days of build your own operating system.
Well, but on the other hand, if you don't understand these concepts, I would argue you
can't use an operating system effectively.
Yes.
Right?
I mean.
Yes, but we should let him say that.
Yeah.
Okay.
What he said.
A lot of these concepts are lost in the abstractions that people have of how an operating system works.
So, you know, semaphores, how are those used?
What do those provide?
Mutexes, what do those do?
Those are typically presented from the computer science point of view, where you've got a lot of abstraction, you've got a resource-rich compute platform that has lots of memory and lots of megahertz. We tend to use synchronization between tasks or threads in a different way than most CS people do, just because our applications are quite different.
And we're talking about processor architecture, and even quite a lot on assembly. Why is that important to modern developers?
I mean, do people really need to understand these things in order to use them?
If their code doesn't need to work, then it doesn't really matter.
That's 90% of everybody's code.
Yes.
Yeah.
So the fundamental reason that the software developers or embedded system developers need to understand the processor architecture and its assembly language is that the programs that they write in C aren't executed as C code.
They're translated into assembly code.
And that's the language of the processor.
And that's what the processor executes.
So if you really want to understand what the processor is doing,
you can't just look at the C code
because then you have to mentally interpret it, well, you're guessing at what's actually happening.
What you really need to do is look at the assembly code because that is what the compiler has generated to satisfy the requirement of the C program, which is the specification for the system.
When you look at the assembly code, then you can realize, oh, I'm getting an integer back from this function and I'm treating it as a float. Oh, that's why my program crashes.
But if you look at the C code, that probably isn't very obvious. So debugging is one reason why
you want to understand the processor architecture in the assembly language. Another reason is so that
you can write more efficient code so that you do things the processor can do more naturally and
more easily. And finally, performance optimization. You want the processor to do things as quickly as
possible. And trying to optimize code that's written in C is sort of like trying to type on a keyboard wearing oven mitts.
You know, you don't have that much control. If you take a look at the object code,
so the assembly code that the compiler creates from your C code, then you can realize, oh,
the compiler is doing this extra work. It must think that it's possible for this variable to be
modified by something else. Therefore, it's doing an extra save and then load.
And, well, let's see, how can I change my code
to tell the compiler it doesn't need to do that?
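For example, one common change of that kind is reading a shared variable once into a local, assuming you've decided it's acceptable for the loop to use a single snapshot of it; the names here are invented for illustration.

```c
#include <stdint.h>

extern volatile uint16_t gain;   /* could be updated elsewhere, e.g. by an ISR */

/* Before: because gain is volatile, the compiler reloads it from memory on
 * every iteration, adding a load to the inner loop. */
void scale_buffer_slow(uint16_t *buf, int len) {
    for (int i = 0; i < len; i++) {
        buf[i] = (uint16_t)(buf[i] * gain);
    }
}

/* After: read it once into a local.  Only correct if the whole buffer may
 * use one snapshot of gain -- a decision you make consciously -- but it
 * removes the repeated loads from the loop. */
void scale_buffer_fast(uint16_t *buf, int len) {
    uint16_t g = gain;
    for (int i = 0; i < len; i++) {
        buf[i] = (uint16_t)(buf[i] * g);
    }
}
```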
Similarly, typecasting.
If we have mixed type calculations,
an integer times a float,
the integer gets promoted into a float
and then the multiplication gets done as a float.
That's not obvious from the C source code, but once you look at the object code, you
see the subroutine calls that perform the type conversion and it's painfully obvious
what's happening.
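For instance, on a Cortex-M0+ with no floating-point hardware, a mixed int-and-float multiply turns into runtime library calls. The assembly in the comment is only roughly what a compiler emits (the __aeabi_* helpers come from the ARM runtime ABI; exact output depends on the compiler and options).

```c
/* Roughly what a compiler produces for a Cortex-M0+ with no FPU; exact code
 * varies by compiler and options:
 *
 *   scale:
 *       push  {r4, lr}
 *       movs  r4, r1              ; save gain (soft-float ABI: float in r1)
 *       bl    __aeabi_i2f         ; promote int count -> float (library call)
 *       movs  r1, r4
 *       bl    __aeabi_fmul        ; single-precision multiply (library call)
 *       pop   {r4, pc}
 */
float scale(int count, float gain) {
    return count * gain;          /* the int is silently promoted to float */
}
```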
I can see how being able to read the assembly would make for a better C programmer.
Do you have students who want to go out into industry
as assembly programmers after taking your courses?
Some do, yes.
Sometimes assembly code is what you really need to program in
in order to get great performance.
One example of that is the advanced SIMD
instruction set for the ARM Cortex-A
series processors, so Cortex-A8 and later.
These give you single instruction, multiple data instructions,
which let you work on 128 bits of data at a time.
Those instructions,
the compiler probably isn't going to generate those instructions
unless you're really, really lucky.
So you need to tell the compiler how to use those instructions,
how to generate them,
either using intrinsics or ARM C language extensions,
or you need to write assembly code yourself.
And doing assembly code yourself is something that gives you complete control over what's happening.
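A small example of those SIMD instructions via ARM NEON intrinsics from arm_neon.h; the function itself is just an illustration and assumes the length is a multiple of four.

```c
#include <arm_neon.h>

/* Add two float arrays four lanes at a time using NEON intrinsics. */
void add_arrays(const float *a, const float *b, float *out, int len) {
    for (int i = 0; i < len; i += 4) {
        float32x4_t va = vld1q_f32(&a[i]);   /* load 4 floats (128 bits) */
        float32x4_t vb = vld1q_f32(&b[i]);
        float32x4_t vr = vaddq_f32(va, vb);  /* 4 adds in one instruction */
        vst1q_f32(&out[i], vr);              /* store 4 results */
    }
}
```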
It does, and it's fun.
But I hope that most people who are listening realize that that is a 0.1% problem, not a 99% problem.
I mean, most of the industry stuff I work on, I read the assembly sometimes, especially for time-critical things, but I haven't programmed in assembly in years. Now that I think about it, I guess I did some Trinket stuff that had to be in assembly for timing reasons.
Yeah. Well, and like Alex said, the special instructions tend to be the place where it's most necessary these days,
depending on your compiler and how smart it is.
But, you know, a lot of times a new thing comes out, a new extension or something,
and, you know, you've got to either wait for the compiler to catch up or it never catches up,
so you kind of have to do it yourself.
But yeah, I mean, I think assembly is important.
And it's different degrees of important depending on what you're doing.
For some things, it's critical.
For other things, it doesn't matter at all.
And that's where Donald Knuth comes in.
Premature optimization is the root of nearly all evil.
Really, the goal, the better way to write software is you write it in a high-level language like C
that lets you get it working quickly, and then you find out if it's fast enough or not. And if it's
fast enough, great, you're done. And you probably finished 10 or 100 times faster than if you had done it in assembly.
If it's not fast enough, you profile it and you find which parts are taking up most of the execution time.
And you look at the source code for that.
And maybe you'll be able to tell from the source code what's going on, why that code is taking so long.
But maybe you can't.
And in that case, you need to look at the assembly code.
And then it tends to be obvious what's going on, why that code is taking so long.
And then you can hopefully tune the C code to help the compiler produce better object code.
Or maybe you have to go directly to assembly code there.
Yeah. And Christopher took away my whole point of this section.
Sorry.
It really does help to understand how things work, to use them.
Exactly.
Writing the book, was it something you sat down one day and said, I'm going to write
a book? Was it something that came out of your classes and you looked at this giant pile of paper and said, you know what, with some editing, that could be a great book?
How did you get there?
What path did you take?
Well, I had previously developed course materials for the Arm University Program's Embedded Systems Education Kit.
And those materials were a great starting point for writing the textbook.
Previously, those course materials actually are an evolved version of the course that I started teaching back in 2000. And it's grown over the
decade and a half and changed and the emphasis has changed, but it's been something that
has evolved. And I realized that not having a textbook really puts the students at a disadvantage because they're the ones who aren't able to
come to all the lectures and pay attention all the way through are at a disadvantage.
But if they have a textbook that they can read, the textbook is able to provide a narrative,
a story that walks them through the steps and explains why things are important.
And it presents mostly the same information,
but in a different way.
And it really helps as a supplementary learning resource.
It's the supplementary learning resource
that I find undervalued by most people.
We have a listener, and I'm not going to say his name
because now I'm going to
suggest that I don't agree with his path, who wants to teach an electronics course to high school students. And he's done a great job of reviewing all of the available source materials.
And at the end, he said he was thinking about taking one chapter from each, whatever was most pertinent. And I haven't
responded yet, but I kind of want to say, well, all the notation is going to be different. And
as learners, you're not going to be, they are not going to be able to just ignore the notational
differences, but worse in a year when they actually want to do something with this, they
don't have anything to go back to. A solid,
finished book not only has the care and thought of how am I going to present this,
it has that returnability to it that, oh, I remember I saw this somewhere. Oh, right,
it was in that class. It must be in that book. And then there's the aspect of if it's on your shelf, if it's on my shelf and you walk by, you have an expectation that we can talk about that book and that we have a shared background and shared knowledge. So I really like those pieces as a way to really make a good case for making a book about these things.
I agree.
I definitely agree there.
There are all sorts of, you know, data books, data sheets available, programming manuals,
blog posts, you know, things on the web, but they're not coherent and they're not exactly,
they're not solving the exact question, you know, the problem that we're looking for the answer to.
Oh, yeah. And I mean, sometimes the blog posts are nice because they are the exact problem you have,
but they aren't necessarily the stream of information. I mean, I know you know Chris
who is one of the contributors to the Embedded FM blog, and he's been doing a series. And one of
the things we've talked a lot about is the progression of the series.
How do you go from point A to point B?
I mean, how do you explain assembly if you have to explain hexadecimal first?
Like, where do you draw the lines and how do you make an order?
It's tough.
It really is, I know.
How did you go about that?
Did you have an idea from your classes, or did you think about it for your book especially?
My vision was, or my expectations for the students' background were based on prerequisite
courses that we've had for our Intro to Embedded Systems course.
So students have typically had an introduction
to computer organization class.
So they've seen assembly programming
and they've seen computer organization,
they've seen some digital logic design and so forth.
So I expected them to be familiar with hexadecimal and C programming and
the concept of assembly language versus C.
But yeah, it was based on what the prerequisites were for the classes that I teach.
And yet it does seem to be not necessarily specific to your environment. It was a general enough book that there was a lot
laid out in a good pattern. Maybe you can't start from, oh my god, I've never seen a
computer, but you could start at the beginning and work your way through and look up what you
needed to. It made a lot of sense. Thank you. Yeah, I put the introductory, well, so the first chapter: what's an embedded system?
Here are some examples.
What do they have to do?
How are they different from general purpose processors?
But after that, I wanted to get into the hands-on aspect of blinking an LED because that's really the hello world for embedded systems
And I thought, okay, so how do we get there as soon as possible? Because I want to
set the hook and get the students interested before they
have to slog through the more theoretical and abstract concepts. So that's why I laid out the book the way I
did, with the hands-on LED flashing and switch reading right in chapter two, before
we get into concurrency, which is chapter three, and then
the CPU and interrupts are in chapter four.
And unlike my book where I totally said, no, I'm not going to talk about language. I'm not going to talk about a specific processor. I'm going to talk about generalities and
try to build these bridges. You went ahead and said, this book uses the Freedom KL25Z,
the, I want to say Freescale, but it's NXP, but a particular MCU development board.
What made you choose that one?
So I chose that because that was the target platform that ARM had selected when they asked me to develop course materials.
So they were looking at several different Cortex M0 Plus
based microcontrollers, and they ended up with the Freescale KL25Z, which now is NXP,
but this year I believe Qualcomm is purchasing them. So anyhow, use a pencil. So that's why I
chose the processor for this book, because I already had the material that I had developed and I've been teaching for several years at NC State.
Prior to that, we were using processors from Renesas.
Maybe Qualcomm will let them have some vowels in their name.
I hope so.
Fruidum.
Yeah.
How did you initially get hooked up with ARM? I mean, you were teaching classes with them, but did you approach them and say, okay, I have an idea to put these classes in a book form, or did they approach you? How did the editorial process work? I'm always interested in other author stories. Okay. Well, when we started discussing the possibility of developing course materials, I said, you know, if you really want to do the job right, you also need a textbook.
Because the course materials, okay, those are going to help the instructors that want to adopt a course and teach it. But really, if there's a textbook that goes along with the course materials,
it's so much more useful and it's a much more solid package
than if you just have PowerPoint slides, homework solutions, example projects,
an outline, exams.
That's only part of the course.
Having a textbook really helps the student and the instructor understand what's going on.
So we discussed that up at the beginning, up at the front.
So the textbook was essentially a follow-on project to the course development.
And did they do a lot of editing?
No. No, they did not.
There are... No, the typesetters changed some things around.
It was pretty funny.
They did not like the contraction "let's,"
so they kept changing it to "let us,"
and I had to change it back.
But no, they did very little editing, which was good.
It's tough.
I mean, I know that Andy Oram made my book quite a lot better.
Because he took a look at it with beginner eyes.
Well-educated beginner eyes.
But still, it was new to him.
And he definitely made my grammar better.
And apparently, the typesetters wanted a lot more commas than I am accustomed to.
But yeah, trying to work with another editor for a different project.
It was a lot of editing.
I wasn't into it anymore.
Not without the discussions.
So there's good and bad to having them have a light hand.
Yes.
It wasn't available, your book.
It wasn't available when I looked today.
When will it be available?
It should be available any day now.
The electronic version is available, and you can get that from, let's see, the ARM media site. The print version had an issue with binding, and it should be resolved by the end of May, if not early June.
Okay, well, this will go up early June, so it'll be fine.
Right.
And that'll be available on Amazon, all the normal book-selling locations?
Yes.
Cool.
And we will have a link to that, of course.
So Chris Svec told me about a project you were giving in one of your embedded optimization classes about energy savings.
Can you tell us about this?
Sure, sure. This was the last project in the course, and
the idea is we're using the Freedom KL25Z board. So this is a microcontroller board
from NXP. It has an ARM Cortex-M0 Plus processor core in a Kinetis KL25Z processor, 16 kilobytes of RAM, 48 megahertz, 128 kilobytes of ROM.
Anyhow, the board has a three-axis accelerometer that you can talk to through I2C, and an RGB LED. The project required the students to optimize the energy
consumption of a program that flashes the LED twice a second based on whether the
board is tilted past 15 or 30 degrees, so it's color-coded, indicating the level of
tilt. And I gave them a starter program and said, okay, you've got a week and
a half or two weeks to make this thing run as long as possible off of a 0.33 farad ultracapacitor.
The ultracapacitor will be charged up to 2.75 volts, and your goal is to make the
system run as long as possible. And the code, yeah, the code's not terribly efficient.
It works.
Maybe you should optimize the code,
but there are a lot of other things that you should look at first.
So they ended up taking this code
and, well, first off, speeding up the I2C communications
as much as possible
so that then the processor could be put into a sleep mode for longer.
Turning off all the peripherals that were not used.
Choosing a clocking mode that allowed the processor to wake up really quickly. The starter code that I had given them took, I don't know, 800 microseconds to wake up.
Something ridiculous.
And that wasn't intentional.
It was just something I never got around to looking into until later this semester.
So anyhow, it turns out there's a clocking mode that lets you wake up quickly.
It doesn't have to start up the phase-locked loop.
And anyhow, so changing the processor clocking mode, putting the accelerometer peripheral into this auto-sleep mode.
Oh, yeah.
And then flashing the LED.
Well, the LED doesn't, the LED just has to be barely visible in normal room light.
So you don't have to have it on for 10 milliseconds. It turns
out that eight microseconds is enough if you're up there around 2.8 volts. But as the voltage drops,
you need to keep it on longer because the light is dimmer. And of course, we're looking for three
colors, red, green, and yellow. And the yellow is made by turning on the red and the green LEDs
simultaneously. Anyhow, the green LED has a larger forward voltage. So that's the one that fades out
sooner than the red. So you actually have to turn on the green LED for longer for it to be visible
at a given voltage compared to the red LED. So all these neat factors that, you know, multiple
dimensions that come into play, trying to squeeze as much energy out of the ultracapacitor as
possible and trying to make the system work at as low a voltage as possible. It's a lot of fun.
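A condensed sketch of the kind of main loop the optimized projects end up with; every driver name here is a placeholder, and the real project used KL25Z-specific clock, sleep, and accelerometer auto-sleep modes. It also glosses over keeping the green channel on longer than the red.

```c
#include <stdbool.h>
#include <stdint.h>

/* Placeholder drivers (not the actual starter code). */
extern int16_t  accel_read_tilt_degrees(void);       /* over I2C, accel in auto-sleep */
extern uint16_t measure_supply_mv(void);              /* ultracapacitor voltage        */
extern void     led_on(bool red, bool green);
extern void     led_off(void);
extern void     delay_us(uint32_t us);
extern void     sleep_until_next_half_second(void);   /* deep sleep, fast wake-up      */

int main(void) {
    for (;;) {
        int16_t  tilt = accel_read_tilt_degrees();
        uint16_t mv   = measure_supply_mv();

        /* Keep the pulse just long enough to be visible: a few microseconds
         * near full charge, longer as the capacitor voltage sags. */
        uint32_t pulse_us = (mv > 2600u) ? 8u : (mv > 2200u) ? 40u : 200u;

        if (tilt > 30)      led_on(true, false);    /* red: large tilt      */
        else if (tilt > 15) led_on(true, true);     /* red + green = yellow */
        else                led_on(false, true);    /* green                */

        delay_us(pulse_us);
        led_off();

        sleep_until_next_half_second();             /* flash twice a second */
    }
}
```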
And the software aspects of it are by no means everything.
And computer science students who have taken the class have occasionally said to me, but it's not fair.
You know, there's all this hardware stuff that you can optimize with, too.
Well, that's the whole point of embedded systems.
It's not just software.
Exactly.
And, you know, I was having fun with the students.
I threw out a whole bunch of ideas during the lecture saying, okay, maybe you could do this, maybe you could do that, you know.
What about this series resistor that controls the amount of current through the LED?
Well, that's just burning power.
You know, what if you reduce that current?
Sorry, what if you reduce that resistance so that you have less energy, less power dissipated in the resistor, but you turn on the LED for less time?
Maybe that would be more efficient.
To a point.
Don't blow up your LED.
Yeah, yeah, exactly.
But, you know, I don't know.
I haven't run the math.
I haven't done any numerical modeling, but, you know, it's fascinating.
So there's so many different dimensions to play with.
But at the same point, they only have a limited amount of time to do the project.
And I told them, well, you know, measure.
Think about what's using up the most power, measure, and then decide if you actually want to do the optimization.
So in the end, the starter code ran for six seconds or so on the properly charged UltraCap.
And the top students were able to get the system to run for 40 minutes on that same charge.
That's good. That's really good.
Yeah, I love it.
And that's one of the things that I love about teaching classes and big classes
because I give these wide open optimization challenges and students come up
with things I wouldn't have thought of in a million years. So, you know, I learn from those
and I recycle them in the coming semesters so that everybody learns from them. So it's a blast.
It's such a different way to approach the problem than make your code run at such and such microamps as measured on a multimeter.
By making them look at the whole system, it makes it a far more industry-relevant problem.
I'm glad to hear that.
Seriously.
And it helps with the power on and the sleep. And you can say an average
current reading, but that doesn't mean you've taken into account things like how the voltage
changes and how that affects your LED visibility.
And I should also say that in this project, I showed them: I put a 10-ohm resistor in series with the positive supply rail.
I used my Digilent Analog Discovery 2 test equipment, and I used the differential oscilloscope probes to look at the voltage on either end of that resistor.
And that voltage is proportional to the amount of current
that the system is drawing.
And I put that up on the projector and showed the students,
okay, here's how we can measure how long the processor is drawing,
how long the system is drawing current,
and how much current it's drawing.
And, oh, we can superimpose the I2C communication signals here.
So we can see when the bus transactions are.
We can see when the LED turns on.
And that gives us a real insight into how the system is using power. And that's really critical if you want to efficiently optimize it. If you go far enough, you can break somebody's
AES key with that. Yes, you can. I suppose so. I don't have time for that kind of thing.
You said it initially worked for six seconds.
And the part of my brain that is always in consulting, how can I help people mode, immediately said, okay, so it would probably take not very long to get to about 6x of that.
And then after that, the efficiencies start to get a lot more expensive in terms of time. Did you find big improvements in the beginning, and then it just tapered off and you spent longer and longer getting one nanoamp at a time?
If I were the type of person to publish academic pedagogy papers, things like that,
I'd actually have my students track the performance
and how much development time it took to get to that performance level
and then do analyses and publish that.
But for me, that's not nearly as exciting as actually doing the optimizations.
So I don't do that.
But anyhow, the students were given some guidance such as, okay, disconnect the debugger by cutting these three traces or cutting these two traces on the board, because the debug microcontroller consumes a lot of power.
You'll also want to disconnect this.
Well, you'll want to enable the blocking diode on the output of the voltage regulator
because if you don't enable that thing, then current will flow backwards through the voltage
regulator and, you know, bad things happen there. You lose power. So some of the students flailed
around, but I gave them information. I gave them insight into the performance, sorry, the power consumption
breakdown for the processor and the various peripherals. It wasn't a terribly complex system.
So that really helped them make progress. That's understandable. Do you have any other
projects like this lined up or other ones that you've done in the past?
Oh, man, tons.
Tons.
Not just energy optimization.
Power optimization.
Sorry, not just energy optimization.
Runtime optimization is a big one.
You know, make this code run as fast as possible.
The class, so the embedded system optimization class has a project in energy optimization, one in speed optimization, one in memory size optimization, and another in responsiveness optimization.
So the specific projects change from semester to semester, but they follow the same rules.
This fall, I think we're going to be using SPI communications with a micro SD card,
and we're going to look at optimizing that. That's such a pain.
Yeah, but the students really understand it. And actually, that's one thing I also want to point out, talking about the scheduling. The students coming out of the classes are understanding the
scheduling well, because I start off with, well, this past semester we had I2C driver code that they had to write.
Actually, they started with a blocking I2C driver implementation where the code sends a byte, like I guess the address byte, to the I2C peripheral, and then it waits.
It blocks busy waiting for the transmission to complete,
and then it sends the next byte.
And anyhow, it's using busy waiting for all of those.
So I have the students put twiddle bits on the program
so that it indicates what it's doing when,
and they look at it with a logic analyzer.
And then they convert that busy wait code to a finite state machine-based approach.
And then they move the finite state machine into an ISR.
And they realize, boy, I can free up a lot of processor time.
And then we move to using a real-time kernel, RTX.
And there, instead of the finite state machine,
we have them use an interrupt service routine
that sends a signal to the main thread saying,
oh, I sent another byte.
And then finally, we've got almost all the work done in the ISR.
And the last instance of the ISR sends a signal to the I2C thread saying,
okay, I just finished transmitting the entire message.
And they get an idea of how much overhead there is,
but more importantly, how to communicate between a thread and an ISR using signals
or semaphores or whatever.
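A sketch of that final step using the CMSIS-RTOS v1 signal calls that shipped with Keil RTX (newer RTX versions use thread flags instead); the I2C handler name, helper functions, and payload are illustrative, not the course code.

```c
#include <stdbool.h>
#include <stdint.h>
#include "cmsis_os.h"            /* CMSIS-RTOS v1 API, as shipped with Keil RTX */

#define SIG_I2C_DONE  0x0001

/* Hypothetical helpers for this sketch. */
extern void start_i2c_transfer(const uint8_t *buf, uint32_t len);
extern bool i2c_transfer_finished(void);

static osThreadId i2c_thread_id;   /* filled in when the thread is created */

/* The ISR does the per-byte work; once the whole message has gone out it
 * signals the waiting thread and returns. */
void I2C0_IRQHandler(void) {
    /* ...feed the next byte / clear flags (device-specific)... */
    if (i2c_transfer_finished()) {
        osSignalSet(i2c_thread_id, SIG_I2C_DONE);
    }
}

/* The thread blocks, using no CPU time, until the ISR signals completion. */
void i2c_thread(void const *argument) {
    static const uint8_t config[2] = { 0x2A, 0x01 };   /* example payload */
    (void)argument;
    for (;;) {
        start_i2c_transfer(config, sizeof config);
        osSignalWait(SIG_I2C_DONE, osWaitForever);
        /* transfer complete: move on to the next transaction */
    }
}
```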
And the brain gymnastics that go with understanding how signals or callbacks work in situations
like this.
Yeah.
Yeah.
Yeah.
Yeah. Yeah. One of the questions I get, or one of the series of emails that I get is, look at my new language. Look at Rust, or Go, or Fred, or...
Or Node.js on...
Node.js. And I'm not going to include Forth in there, because I really do believe Forth has a special place. We won't just say what it is, but it's a special place. But you are teaching in C and in assembly. Do you think these other
languages will be important in the short term in your students' careers in the five years or 15
years? Or do you think C will always win because it is the least common denominator? I'm glad you asked that
question. A student asked me a couple of years ago, why aren't you teaching this class in Rust?
Yes, that's one of them. Yeah, and I said, because you wouldn't be able to get a job if I taught this
in Rust. And that's really what it boils down to.
C, I wouldn't call it the lowest common denominator. I'd call it the solution that
solves 99% of the problems out there. So there are all of these development tools that target C.
People understand how to code in C. There are a lot of C compilers out there.
There are a lot of libraries out there in C. If you take a look at it from the point of view of
an industry project manager saying, okay, I've got my code, you know, I've got my widget here,
and I'm going to upgrade it. I'm going to add some features. Hmm. Should I continue with C or
shall I switch to a new language? All of my development team knows C. None of them know the
new language. Hmm. Okay. So in that case, it's pretty obvious you stick with what's there.
We can make the situation a little less biased. Okay. I'm a new, I'm starting, I'm kicking off a
new project. Okay. What language should I program in? Hmm, should I go with C or should I
go with Rust or should I go with Java?
The risk levels associated with those other languages
are so much higher than the ones associated with C
that it's a no-brainer to go with the language that
is supported best by industry.
What about C++?
Sure. Well, C++ too. So that fits in the category, C, C++.
But I've seen a lot more embedded C than C++ in the embedded design reviews I've done for industry. There were, let's see, there were a couple in assembly,
but the vast majority were C.
There were a few in C++, but generally the, well, yeah,
the vast majority of the embedded projects that I saw were in C.
Do you find the argument compelling, and I've heard this a lot, that
yes, we all have
great institutional momentum
or inertia behind C,
but given the security
challenges with embedded systems now
being connected to the internet,
we need to look beyond
C to languages that
protect better. That's the argument
I hear about Rust a lot,
is that, okay, this is designed to solve the security problems with C.
Does that resonate with you, or do you find that to not be compelling?
That argument may be compelling for a small subset
of applications in industry,
but it's like the Dvorak keyboard, right?
Compared to the QWERTY keyboard.
Can you buy a laptop with a Dvorak keyboard these days?
I doubt it.
But it's much more efficient
but everybody's using QWERTY.
So I can imagine that for some applications
Rust and other application-specific languages do have certain benefits.
The challenge is you have to compare them against what's already there.
What's the installed base of tools and users and expertise and understanding what limitations are and how things break?
And the cost of switching over to something new with all of these unknowns.
It is a chicken and egg problem, though.
I mean, if the colleges don't teach Rust and Go and even Ada seems to be resurging, then
we won't have anybody who knows them.
And if nobody knows them, then we will forever use the argument
that nobody knows that language.
Well, I think industry needs to...
I'm not sure which way the driving happens, right?
Oh, I think it goes both ways.
I mean, I think it's hard to hire somebody who knows Rust to join my team.
On the other hand, if I only know Rust, it's hard to get hired because it's just a smaller piece of the pie.
All right.
I'm sorry.
I was just curious about the languages.
That wasn't what we were supposed to be talking about.
No, that's fine.
I have opinions on those, too.
Why don't you program in Java? Well,
there's a good reason. We went through this, actually, our computer science department
right about the time that I arrived. They switched teaching undergrads. They switched from C to Java.
And we started having to teach C code to the students because Java doesn't work for us in ECE. It's too much abstraction and
it just doesn't work for ECE. So we've had to recreate our C curriculum or recreate a C
curriculum so that the students would learn properly. Switching away from languages,
what areas of Embedded do you think will be most interesting in the next five or 15 years?
I admit I've been on a bit of a machine learning kick, but what are you looking at for the long-term careers for your students?
Oh, boy.
Everything is going to be really interesting.
Yes.
It's the security and the internet, and yes, it's really all these things.
Yeah, there's so much new material coming out there, and systems are getting
more and more complex. To be successful, you have to be able to manage that complexity
and go into detail where you need to, but be able to
stay away from details where you can avoid them. And you look at the IoT and there's a lot of
complexity in there. There are a lot of security risks. It's going to be interesting to see how that plays out. But I can't really point out any particular area as being the hot area because there's so much development going on everywhere.
The energy efficiency for these microcontrollers is amazing.
Peripherals that are getting more and more powerful and are able to operate autonomously.
So Microchip has this concept of core independent peripherals,
which are basically supercharged peripherals
that can trigger each other and share data without the CPU.
So even more powerful.
You look at Cypress's programmable system-on-chips where you've got
all that stuff. I mean, it's amazing what's out there. And I'm looking more at the infrastructure
rather than the applications. So, you know, I don't have the imagination to say, oh, this is a
really hot area, but there are going to be all sorts of things that are enabled by these fantastic technologies that are coming into the field.
I like that optimism, and I like hearing it from you, because you're somebody who is talking
to the next generation.
So thank you for that.
My pleasure.
It's a fun field to be in.
If you don't let the complexity overwhelm you, it's a really fun field to be in.
And I agree, totally.
Yeah.
But I think we are out of time for this show, even though I have many more questions to ask you. Thank you so much for joining us. Do you have any thoughts you'd like to leave us with? No, I guess not.
I'm not going to plug myself
for anything. I've done that enough.
Embedded systems are cool.
All right,
then. Our guest has been
Alex Dean, professor at
North Carolina State University
Department of Electrical and
Computer Engineering, and author
of Embedded Systems Fundamentals with ARM Cortex-M Based Microcontrollers: A Practical Approach. Again, thank you for being with us, Alex.
My pleasure. Thanks for the opportunity.
And I would like to thank Chris Svec for introducing me to Alex and to thank Christopher for producing and co-hosting.
Thank Digilent for their 15% off coupon that was EmbeddedFM15.
And of course, thank you for listening.
I, too, have a final thought to leave you with.
It's fairly short, I promise, I promise.
From Jackson Kittert,
think of life as a school for your soul.
You're here to learn in perfect well-being.
Here's a tip for life's pop quizzes.
Instead of asking why something happened,
ask instead, what can I learn?
Embedded is an independently produced radio show that focuses on the many aspects of engineering. It is a production of
Logical Elegance, an embedded software consulting company in California. If there are advertisements
in the show, we did not put them there and do not receive money from them. At this time,
our sponsors are
Logical Elegance and listeners like you.