Embedded - 263: Experience the Theory

Episode Date: October 4, 2018

Professor Angela Sodemann of @ASU spoke with us about new ways of teaching, robotics, and haptic displays. Angela’s robotics courses can be found at RoboGrok.com, including the parts kit. Note that... they focus on creating usable robotics as well as teaching theory so there is math, code, and hardware.  

Transcript
Starting point is 00:00:00 Hi, I'm Elecia White, and you are listening to Embedded. My co-host is Christopher White, and today we are joined by Professor Angela Sodemann to talk about education, robotics, and creating haptic displays via resonant microbeam vibration. Hi, Angela. Thanks for joining us today. Thanks for having me. I'm honored to be here. Could you tell us about yourself as though we were meeting at a technical conference and sitting down to lunch? Sure. I'm an assistant professor at Arizona State University at their polytechnic campus. And I teach robotics there and I do some research in different kinds of robotics related topics. And I'm particularly interested in
Starting point is 00:00:56 figuring out how we can do education more efficiently, meaning more effectively and with less effort to more students at a lower cost. And I want to do that through using new kinds of technology. And just a clarification, we are educating the human students, not the robots. That's right. But if the robots want to do the educating, that would be fantastic. All right. So let's do lightning round where we ask you short questions and we want short answers, although we don't have to. And if we are behaving ourselves, we won't ask how and where and when until after we finish. OK, the first one goes to me, apparently. Robot overlords or robot servants? Oh, are you asking what I think is going to be the result or what I want to be the result?
Starting point is 00:01:54 I think we all want the result not in the near future. But I think that really effective robot servants are also pretty far in the future. Research or teaching? Ooh, I would say that I enjoy both just as much. but I think that the teaching is more important, so I put more of my time into that. Robot operating system, yes or no, and if yes, kinetic or other? Oh, I'm going to have to skip that question. I don't know enough about it to say. I think that boils down to a no. Given infinite resources, infinite,
Starting point is 00:02:52 should a robot be designed for its purpose or designed as a humanoid? Ooh, I'm going to have to go with its purpose. I guess I don't have enough faith in humans to say that they should be designed as humans. Robert Heinlein, Andrew Norton, Marion Zimmer Bradley, or Isaac Asimov? Oh, Isaac Asimov. The three laws are the foundation of all excellent robotics thought. Should robots be social with humans?
Starting point is 00:03:26 They're going to have to be, whether they should or shouldn't. So I think we have to figure that out sooner rather than later. One more. Do you have a tip everyone should know? Tip everyone should know about anything in particular? About anything you want. Your ability to learn things on your own is one of the most important abilities as you're becoming an engineer. So develop that skill, I think, above all others. That would have been good advice to me like 20 years ago. Okay, so RoboGrok. That was how I first heard about you and what you've been working on.
Starting point is 00:04:10 So maybe we should start there. Sure, yeah. What is RoboGrok? Well, the name RoboGrok comes from robotics, of course. And then Grok is from a famous science fiction novel called Stranger in a Strange Land. And in that book, the word grok meant to know or understand something deeply as if you were one with it. So this is my attempt to develop a system to teach robotics to people in a way such that they will truly, deeply understand it. the theory and the mathematics of the robotics together with what I call challenges, which are hands-on activities that require the student to understand the theory in order to make the So throughout the course, the students have a kit of parts.
Starting point is 00:05:28 And in each section of the theory, they watch a video that presents the theory and also leads them through an activity that uses that theory. And the idea behind this is that this will help students to connect the abstract mathematical concepts with the physical concrete machines. And I think that making that connection is one of the most important skills for a budding engineer to develop. So that's the overall goal and process of RoboGrak. And as not a budding engineer, but perhaps an experienced one, I found robotics impenetrable in spots because it was all math that seemed unrelated to what I actually needed the robot to do. I mean, I know how to tune a PID loop, but inverse kinematics, it's just a function. And it's a function that works basically poorly because I don't know how to set up my robot exactly properly. Right. And this is a very common observation of people who are learning robotics. And I think
Starting point is 00:07:00 that the way to get over that hurdle is to do the math and build the machine at the same time. So you build the machine, you do the math for that machine, you take your math, put it into the programming, and then you try to make the machine work. If your math is right, then the machine will work correctly. And if your math is wrong, it won't work correctly. And seeing how the machine works wrong when your math is wrong tends to help you to make the connection of what's wrong with my math and why. Because it's not just happening in the math. It's happening in the machine at the same time. And I noticed with some of your lectures, the way you set up how you determine if it's correct was more useful than any way I had come up with trying to apply the math to my problems. You had grid paper that showed the robot in better ways,
Starting point is 00:08:08 and you had ways of setting it up so that you could run the math with software. Yeah. So many times, if you go out and you buy a robotics kit, the kit is designed to allow you to build a particular robot and then do things with that robot and so on. But the kit that we use with this class isn't designed to just let you build a robot. It's designed to try and illuminate what's going on under the hood with the robot. And so, for example, as you mentioned, our base plate is designed to have all of the X and Y positions marked and angles marked. And the end effector of the robot is a marker that can draws its path as you go. The purpose of the kit is not to make a robot,
Starting point is 00:09:13 but the purpose of the kit is to experience the theory of robotics as you go. Experience the theory. That's a great way to put it. What is Indicate? Well, it has a selection of mechanical components, links and servos and some different kinds of motors. There's a DC motor with an encoder on the back and a stepper motor. It has electrical components, LEDs and resistors and an inductor, motor driver, that kind of stuff. And there's also a microcontroller in the kit. And all these parts are designed to allow you to do all the different kinds of experiments. Oh, there's also a camera.
Starting point is 00:10:12 So there's also a machine vision component of this that is included. I haven't gotten there yet. I'm going to have to hurry up. It's good fun. I can't wait till you get there. The microcontroller is the Cypress Psoc 5 that's right uh and it's it's half like normal cortex arm microcontroller and half fpga uh yeah that's right well it's it's similar to the same kind of a thing as a pic chip or an arduino or anything like that, the reason why I chose that microcontroller as opposed to another one is mostly because of how it provides a good tradeoff between ease of use and ability to see what's going on under the hood in the IDE. So for example, with an
Starting point is 00:11:10 Arduino microcontroller, it's designed to be very easy to just pick up and use by someone who is a hobbyist or has never programmed a microcontroller before, but it doesn't show you really what's going on in some of the functions that you use. On the other side of the scale would be something like a PIC chip, which is very difficult for beginners to learn how to use, but it's very powerful. You can directly write bit strings to the registers and things like that. So to give you an example of this, if you want to set the angle of a servo, in order to do that on Arduino, you just use a built-in function called servo, and you write a value to servo. On the other hand, with a PSOC, with their IDE, you have to set up the PWM, the pulse width modulation signal, and get the pulses to have the right duty cycle and things
Starting point is 00:12:17 like that. So if you are a person who is an engineer or wants to become an engineer, it's beneficial to you to actually go ahead and learn how things like a PWM works so that you can use them in the future for other purposes. So I think that the PSOC provides a good balance. It's not too difficult to learn how to use and yet really shows you what's going on under the hood. What compiler do you use for it? Oh, I just use the built-in Cypress compiler. I don't use a separate one. Okay, that makes sense. And it's free?
Starting point is 00:12:57 It is. It's free. It only works on Windows, unfortunately. So if you have a Mac, you'll have to run boot camp or something like that but yeah it's free and the students in your course at asu get the kit as part of the class well they they still have to buy the parts for the kit but i sell it to them at cost so i buy all the parts myself put them in the kit and then they buy it at cost from me. And if I wanted to buy the kit, you have it on your website on Robocrock.
Starting point is 00:13:32 That's right. Yeah. I didn't seem... Oh, I see. For ASU, it's $95. And for general public, it's $395. Right. That's right. I see. What does the extra $300 buy me? Oh my goodness. It buys you the access to all of the lectures. So you don't have to pay for those separately. But of course, putting those together took a lot of work and effort in designing the kits, so I can't sell the kits at cost to the general public. That's fair. And it should be pointed out that the ASU students are paying for their tuition, so it's not an apples- apples comparison here. Right. They're not getting the lectures for free either. They're just paying for them through another channel.
Starting point is 00:14:31 But the lectures are free if you don't get a kit. That's right. If you just want to watch the lectures and see what you could do with the kit, you're absolutely welcome to do that. And I think many people who are in another robotics class at another university can benefit from watching, you know, how Jacobian matrices are derived and things like that. I mean, it's been really helpful for me. I haven't gotten a kit yet. But the Denevitt-Hartenberg setup, I was totally not getting in another online class. And yet, the way you did it, it made sense. And I could sketch my robot arm and try to figure out what its things were and realize that there was an X and Y and not just an X.
Starting point is 00:15:22 And yeah, it was really helpful because I could see you doing the applied and the theory back to back. I mean, not always exactly the same time. You have to go over the theory and then you can apply it. But it was very good that it was both. It almost seems like two classes, though. I mean, you have all the theory and then you have like a programming class or an electronics class. of other classes that I assume students have also had. So I make some assumptions about students having already learned something about programming at some time, even though they may not be proficient in the C language or also Python we use in the class.
Starting point is 00:16:20 So it doesn't start from an absolute zero position. There's some assumption of pre-knowledge that's already there. That seems good. Sometimes there are beginner tutorials and beginner classes online, and then you're supposed to apply that magically to far harder problems. So sometimes you should just accept people need things or already have information. Yeah, I've thought about creating, you know, a robotics zero class or something like that, that would give some more of the knowledge that I'm assuming students have maybe sometime in the future, I'll finally get some time to do that. Is this considered a flipped classroom? Um, yes, but it's maybe not just a standard flipped classroom. So the idea behind the flipped classroom is that when you go into the classroom and you listen to a lecture
Starting point is 00:17:26 given by the professor, that's not really the time that you need the professor to be there. Some students ask questions, but most of the time students just kind of listen and take notes. So in that case, there's no reason that you can't have the lecture be a video that the students watch outside of class. Then at the time when students are trying to apply what they've learned, that's when you really want the professor to be there because that's when you're getting stuck. So that is sort of what we do here. The students in my class watch the videos outside of class and try to do the hands-on part as the videos go. And then the students who are stuck come into class and our class time is devoted to me going around and solving one problem after another and answering individual questions. So in that sense, it is a flipped class, although the hands-on component isn't standard to a flipped class.
Starting point is 00:18:30 Is the flipped classroom like this a way to get out of extended office hours? Oh, no. So, sorry, I didn't have any flipped classes. And as you were talking, I'm like, well, you go and you see the professor and you camp outside his office and finish your homework there. case, this ends up having many more office hours than we would have otherwise because, of course, students are trying to solve real problems. They're not just trying to punch things into their calculator. So it ends up being much more in-person than otherwise. Because the kit almost acts as a lab would. I mean, they're touching things, they're holding things, they're moving them around. Right. It works the way a lab would. And the difference that I think is an important difference is that it's a lab that you take home with you and you keep in
Starting point is 00:19:38 perpetuity. So you can in the future say, let me use these parts that I have to prototype this other idea that I have. So it's a lab, but it's yours. You get to keep it. Yeah. They never, ever let me keep like the spectrometers. Right. Right. They don't let you take the many thousand dollar oscilloscopes home with you, unfortunately.
Starting point is 00:20:07 What about people who take the class online? Do they get any interaction with you? Well, not in person. So there are message boards on each page of my website and people who are not physically in my class leave messages there and I answer them. And there's email communication. But that's pretty much all we have right now. I would love to figure out some way and maybe virtual reality will eventually fill in these kinds of gaps, have a way to have in-person interaction with students who are not physically in the class. But I think that we're maybe not technologically there quite yet. Could we simulate using the kit? That would be amazing. I think that, so I recently bought for myself an Oculus Go, which is the lower cost Oculus VR headset that just came out a couple months ago.
Starting point is 00:21:19 And it's amazing. I think that if we could come up with a way to have virtual reality machines that you could interact with at the same time that you're doing the math, maybe that would be probably not just as good as having the actual machine there, but certainly better than having the math with nothing else there. And, you know, lower cost than having to buy all these parts and put them together and things like that. So I think that's an amazing potential for the future of virtual reality. I totally agree. I'm excited about virtual reality and education. The idea of the different things I've learned in math. I mean, plotting something in 3D on a 2D piece of the information you're giving me. If I could just walk through it and see it from all the angles, I would grok it better. Yes. And this is true, especially in robotics, because so much of what we have to do involves imagining how this robot will look if you looked at it from the top or if you looked at it from the side as you know this joint rotates but this joint does not rotate and trying to
Starting point is 00:22:55 picture that in your head and then draw it on a piece of paper is not a trivial leap to make for someone who's trying to learn robotics. And maybe virtual reality will make that so much easier to figure out. I'm so glad to hear you say that. I must have drawn my robot like 97,000 times at this point. And I still can't get just the right angle to make the everything work out. So you put your classes on YouTube. I mean, they're all off of your RoboGrok.com site, but you also, their YouTube list. Right. The, my videos on my website are just embedded from YouTube. And that's very good for being a platform where new students can discover this. Many students are on YouTube searching for solutions to problems that they
Starting point is 00:23:57 can't solve in their own robotics class, and then they can discover my videos there. That worked for me. And then does the site offer more things? Should they be watching it on YouTube or if somebody wants to go see it, should they go to your site? Well, my site has a better organization of the videos. In YouTube, really all you can do is make a playlist that goes from one video to the next. My website organizes everything into two different classes, and it organizes them in a kind of map that draws you from one topic to the other. It has quizzes that you can look at the questions and and uh some of them are multiple choice you know some of them are calculations you can look at the exams that the students took in previous years so
Starting point is 00:24:56 there's some things on the website that aren't on youtube cool good to know You also have a Udemy course. Is that right? Yeah, I put the videos up on Udemy also because I think that sometimes students go to sites like Udemy when they're specifically looking for a whole contained class. And those who are on YouTube looking for things are many times not looking for a whole contained class. They're just looking for a solution to a particular problem that they have. So I have posted my courses other places to try and reach different audiences. Has that been very successful? Somewhat. um somewhat um my youtube subscribers are going up and some students are taking the udemy course but um i don't actually put much time into it right now since my full-time job is actually
Starting point is 00:25:55 teaching at the university so it's hard to uh it is isn't it? So what does the structure of the course look like? If I decide, okay, I want to take this online, or if I'm an ASU student and I'm looking at the course, what can they expect to kind of go through? And is it really structured, or is there, like, you choose a project and you work toward that kind of on your own with guidance?
Starting point is 00:26:26 It's very structured, but you do work towards a project, except the project is defined. So many times when you take a class that has a project at the end, that project is left open-ended. So come up with something you could do with these parts maybe and build that. But in my course, all of the students at the end of the semester make the same project. And the project that they're making is a three degree of freedom SCARA manipulator that does a pick and place operation using a little electromagnet, an inductor, to pick up a ferromagnetic object and then drop it in another location. And they have to use a camera looking down at the workspace of the robot to automatically find the location of the object that's been placed there. So throughout the course of the semester, each of the topics that you go through teaches you one part of what you need to know in order to do that final project.
Starting point is 00:27:36 And then you will use the parts to practice that one little piece and then move on to the next little piece and so on until at the end, you can do this one big project and make that thing work and i mean i yes it all goes towards one project but you also asked about structure and yeah they're in the first one uh there's the kinematic section which goes through all the kinematic stuff i've ever seen before homogenous transformation matrices the done ofovit-Hartenberg, the rotation matrices, all these things that end up with, here's kinematics and here's inverse kinematics, yay, position. But then it goes to sensors and motion control.
Starting point is 00:28:20 So you do get the whole structured piece. And under motion control, she's got PID control and UR communication. I haven't gotten there yet, but the kinematics I've enjoyed a lot. Actually, when we were emailing Angela, we were talking about the difficulties of online education. And structure was one of them. What are the other hurdles you see with other forms of technical engineering-ish education online? Well, besides structure, I think the next hurdle is the depth. So in some of these courses that you find online, they're targeted towards the person who wants to, you know, build a working robot in five minutes or three easy steps or whatever.
Starting point is 00:29:23 And it's not targeted towards the person who wants to grok robotics and so if you want a real university level depth of knowledge about something you need to you need to find an online course that's that's really got some meat to it. And I think that that's hard to find. Many are sort of tutorial kinds of things. The third hurdle that I think is a big one, and I don't have any ideas about how we'll eventually solve this if we do, has to do with certification. So you asked something previously about what's the difference between those who are taking the course at ASU and those who are doing it on their own somewhere. And certification is a big difference here. If you take the course on your own, you'll end up with the knowledge, but you won't end up with a degree in the end. And I don't know if we can figure out some way to certify students who have gone through something like a good Udemy course or something like that to say that these students really have learned this stuff. That's an ongoing hurdle in online education. Udacity has been working on that problem some.
Starting point is 00:30:54 Yeah, and edX, I think, is heading down that path as well. It is tough. With Udacity, they end up having to charge quite a bit more so that they can hire people to grade projects. Right. Yeah. Yeah. Certification itself is not cheap. And it is a path that a lot of people need. I thought about going to grad school, I guess it was last year I was thinking about it. And because I love robotics and I love machine learning and I want to get more into it and sorting through the math and mental gymnastics on my own was hard. So
Starting point is 00:31:32 I couldn't find things with the right depth or the right structure. And so I was like, okay, this is what grad school's for. But I have been working for a long time and the hoops they wanted me to jump through seemed tedious. I don't really need a course in introductory writing or whatever nonsense. I just want to learn about the robotics. Right. But if I worked for a company and I wanted them to pay for my education, I would have to be through certification method. Right. And the kinds of hurdles you have to go through involve things like physically showing up at the university. So you have to live somewhere near the university or you have to commute and the schedule has to work out with your own schedule and things like that really
Starting point is 00:32:26 could be solved through online education where you do it on your own schedule and you do it in your own you know home or office if we could if we could solve the the certification and structure problems, that could be a great boon for engineering education in the future. I mean, I remember taking video classes in a grad, actually at Stanford, many years ago, and then we would go in for exams. And that worked out fine. We all said we were going to class every week, but we never, we just all went to the video place. But it wasn't quite as good watching the video as it was going to the lecture for some reason. Do you have other classes where you do the lecture, the non-flipped classroom, the traditional classroom? I used to.
Starting point is 00:33:29 So I don't anymore. But way back when I was first developing this approach, I didn't have any videos. I did it all in class. And the reason why I went away from that was logistical. So imagine you go to class and you've got all of these parts and the professor is giving a lecture and the lecture includes theory. And then it says, you know, now build this thing and try this. What happens is if there's any appreciable number of students in the class, one person gets stuck on part A and another person gets stuck on part B and so on. And the professor ends up having to go around the class and solve each student's problem and nobody can move on
Starting point is 00:34:18 until everybody's problems have been solved. And I just couldn't make it work. So the reason for switching to the video was actually logistical to allow students to move forward at their own pace. So there's kind of a trade-off here. And I realized that you lose something when it's not in live in person and you're doing it on video but you also gain something also which is that you can do it on your own time and at your own pace i think it's a good trade-off but that said in my class i still do an in-person review at the end of each of these sections to just work problems on the board live with the students. And I'm not going to be stopping that anytime soon. It's valuable. So I'm not really sure how to ultimately solve the problem to get the optimal trade-off here. Well, when I was taking videos,
Starting point is 00:35:17 it was more than 20 years ago. And it was just dark videos. And sometimes they would forget to switch to the professor's slides. So it was just not good yet. Oh, yeah. Badly made videos can ruin it in any case. And it was before the Andrew Ng course with the machine learning course he did, which was, I think, kind of the first one I saw that it was good as online lectures because some of that material I could speed up and some of it I needed to go through three or four times before I really got it. Right.
Starting point is 00:35:58 And that's one of the great things about online education. Absolutely. Yeah. Faster video at 1.5x is my friend. Right. In addition to the ability to skip ahead and skip back, you can play it at different speeds and everyone kind of has their own speed. And that's, I think that's beneficial. I think it's beneficial for both the people who want to go faster and the people who want to go slower because you get to go at your speed. Absolutely. That's important.
Starting point is 00:36:38 And yet, for all the greatness that has come about, I mean, the lectures are better, the speed is better. The online education, you said it had a huge dropout rate, which I totally see. I start classes and don't finish them. Why do you think there is so much dropout in online education? Well, I think it has to do with two factors. And I don't know that I have the data to support this. This is just my observations. I think one reason has to do with engagement or interaction. If you are in the class a passive participant in the class.
Starting point is 00:37:25 You're just listening and watching. It's easy to fall away from that and decide to stop it. But if you are an active participant in the class, which in person that might mean asking questions in class, that might mean going up and talking to the professor, or even simply the act of going to the classroom. That seems to hold people better. When you're taking it online, there isn't much of that interaction, but I think there could be. I think that in a class like mine, where you are actually building things and making things work as you go, that that could potentially help solve that problem by forcing the engagement and interaction of the student. So I think that the second reason is that you have to have the sense as you're going through the class that you're making progress.
Starting point is 00:38:28 You have to sense that I know more today than I knew yesterday or what I've just learned is useful for something. And I think that my hands-on, you know, challenge approach also addresses that problem that, hey, I can now make a particular robot that I couldn't make before. And that's very attractive, particularly to engineering type of people. I can see that. It definitely helps to be able to say, okay, I know more today than I did yesterday, and I can use it. Right. And I know that a lot of people join the online forums, and that helps them be integrated more into the class to feel like there are others in their position, people they can talk to. Yeah, that's a type of engagement and interaction, right? There is no pressure
Starting point is 00:39:27 to complete the online education. That's one of the things Udacity has been trying to do. You have to finish it within six months or pay for it again. Do you think you might do something like that? Do you think you might charge more for it so that people feel the pressure to finish? Oh, you know, I think that's another case where there are pros and cons to the two. You don't want students to be able to easily drop out for no reason, but real life events happen to people as they go through a course and you want to be able to work around real life events so i i'm not i'm not sure about whether that's a good thing or a bad thing you want to have some pressure but not too much pressure people do tend to value the things they pay for. That's definitely true. Yeah. And I think that if you're getting a free online course, you're less likely to finish it than if you're paying for it.
Starting point is 00:40:33 Maybe I could do a study where I compare those who are actually paying for a course than not. I bet that would come out to be true. Since I really don't want you to make this a pay for a course, because I really don't want you to, I don't know why I was headed that direction. I think the only part that I would want to pay for and have pressure on is if I could have quizzes and exams. And I guess that goes back to the certification. Right. Well, on my online course, if you have bought the kit, it comes with an ID code that you can use to take a quiz that goes with each section of the class. And then you can keep track of your own grades. There's a grade display page that you can do that. So I've introduced a little bit of that so far. Where do you think this will go? Do you think this is what education will become?
Starting point is 00:41:39 This is what I wish education would become, but I don't know that I have the confidence to say that education will become this. I think that the universities don't have the, they care a lot about research and maybe not as much about the advancing the teaching mission. And so I think that this is the way that independent, not university, online education might go. I don't think that the universities will end up pushing this direction, although nothing would make me happier than to be wrong about that. But many universities, including ASU, have been at least allowing teachers to do this. It is almost fair for them to say, look, we pay you to do lectures. You can't put the lectures up for free. And yet, Coursera and your course, and I know the math professors we know do flipped classrooms and we can see
Starting point is 00:42:46 their lectures online. They may not be encouraging it, but they are at least allowing it. The video and online lecture aspect, I think we are going in that direction more and more, but I think that the hands-on aspect is less likely to be incorporated. And the reason is because it's a lot of work to design a course that goes along with hands-on stuff and then, you know, buy and then maintain all of that equipment. It's really difficult, takes a lot of effort. So I think that the video aspect is definitely going to be incorporated, but maybe not the hands-on. I've had this idea that you would get an apartment and you would stay with everybody else like in a dorm. And that would be how you do your labs. And so there would be some engineering house in Aptos, where we live.
Starting point is 00:44:06 And it would be an apartment full of engineering students who worked on labs together and studied together and took online classes together. And somehow the colleges of the future would be essentially dorms located wherever it was convenient. Because the best part of college was the dorms. That would be amazing. If someone started that, that would make me so happy. Because it is hard to do labs. And you're hands-on.
Starting point is 00:44:42 It is kind of a lab in many ways. It's just a lab that's very well integrated into a class. Right. But there are other labs that are just, it's too hard to do it yourself. As you mentioned, multi-thousand dollar oscilloscopes, you have to be somewhere. It's too hard as a beginner to have the expensive equipment with you all the time. Right. But of course, now your dorm not only has to have the lab with the students there, it also has to have a lab manager who's, you know, fixing the oscilloscopes when they break and
Starting point is 00:45:22 teaching the students how to use them and things like that. And that's not too far away from the university as it is. Okay, I could see that. So you mentioned research. So I want to talk more about your research. Haptic display. Haptic is for touch. Display is for vision. What is haptic display? Please don't tell me you're putting anything on my eyeballs.
Starting point is 00:46:02 No, no. Thankfully, what we're trying to do is take an image. An image is a thing that you would normally perceive through vision, but some people can't perceive images through vision. They're visually impaired. So if we could take an image and display it in such a way that you could feel it, let's say with your hand, instead of seeing it with your eyes, that kind of a device would have many different kinds of applications. But the application that we're most interested in is for people who are visually impaired. For them to cross the street, for them to read, for them to... I mean, there are many different problems for the visually impaired. Do you have particular ones you're trying to solve
Starting point is 00:46:45 that would impact the resolution of your needed display? Right. If we can come up with a display that has a resolution that's high enough that it's at least in the ballpark equivalent to a visual resolution, then we can use it to display all kinds of images. It doesn't necessarily have to be limited to text or seeing whether the light is red or green at the stoplight or something like that. Those of us who are not visually impaired get a huge amount of information about our world through what we see. And if we could just in some way display that image to the visually impaired, they could also get a huge amount of information from that display. So that's what we're trying to achieve. With so much machine learning and computer vision,
Starting point is 00:47:50 we do get a lot of information through our eyes, but we get a lot of noise through our eyes too. Should we be trying to show them pictures, videos of the world tactilely? Or should we maybe limit that so that it's only the important stuff where important is determined magically? Yeah. It's true that there's a huge amount of noise that we get through the images, through our eyes, but our brain is very good at figuring out what information is useful and what information is not. And I think it's important to keep in mind that for most visually impaired people, what's going wrong is not their brain, it's their eyes. So their brain, in many cases, still has the capability to filter through information and understand context and all these kinds of things that are very difficult still at this point for artificial intelligence to handle. We aren't anywhere near having an artificial intelligence that's as good as the human brain at understanding an image.
Starting point is 00:49:08 So if we can use the fact that the visually impaired person's brain is still fine and just give them the visual information, it might end up working really well. And I think this is possible. I read recently about the neuroplasticity and how MRIs, no, the other one, the brain scans, will show the visual part of a visually impaired person's brain activate when they're asked to think about how they move around their house or things we usually do with our vision. Spatial awareness is still there, and it still highlights the same places. Right. And there's fMRI. fMRI, I was missing a letter. Yeah, there's been a huge amount of research into brain plasticity, and the results are promising for an application like this, that the brain actually would be able to perceive a visual image presented through a different sense.
Starting point is 00:50:16 Okay, so you want high-resolution information, the same as our eyes, which is a lot of data. How are you going to do that? Well, imagine you would show a sighted person an image that is a much lower resolution than what their eyes can actually see, say 64 by 64. You can still tell what the image is showing, even though it's much lower resolution than what your eye can see. So we're just trying to present in our haptic display an image that's something like 64 by 64 or, you know, 640 by 480 or something like that, that we can visually see. But that is the big challenge here. Many of these haptic displays are made using vibration motors, like the little motors that are in your cell phone that make it vibrate when you get a call. And those vibration motors are very small compared to other motors. But imagine you tried to make an array of those things that's 64 by 64.
Starting point is 00:51:42 It would be huge. It would be heavy. It would consume a lot of power. It would be hard to control it with a microcontroller. So we need to figure out a way to make each of these tactile elements that vibrates very small and make it so that we can control many of them with not very many pins of a micro controller. Okay, so I have this idea of those toys you sometimes see that are hundreds of little needle-like objects and you put your hand in and it takes a 3D image of your hand that you can see on the other side. Do you know what I'm talking about?
Starting point is 00:52:33 Yes. Yeah. That's a great place to start for understanding these devices. That kind of a device has one really big drawback to it, which relates to a detail about how our skin senses something that's touching it. We have different nerves for what's known as static touch and dynamic touch. Static touch is when you touch something and that thing is not moving past your skin. And dynamic touch is when you run your skin past something and feel it moving past your skin. As it turns out, human dynamic touch has a much higher sensitivity or resolution than static touch does. So if we're going to make one of these devices with the idea that it's going to be high resolution, teeny tiny things very close together, we're going to have to make them
Starting point is 00:53:41 move past your skin in order to increase your ability to feel them. So like a movie? Well, if you think about the way a person who is blind reads Braille, you know that they slide their finger over the little bumps. And one of the reasons that they do that is because it enhances the sensitivity. So imagine you had one of those bed-of-needles toys, but you could make the little needles vibrate so that when you touched it, they actually kind of moved past your skin. That would, believe it or not, make it so that you can feel what's going on with those
Starting point is 00:54:34 needles much better than if you just put your hand on it and they're static. So now I'm envisioning like waves of needles. Close, yeah. Except instead of moving them up and down, think of them moving left and right so they're vibrating. All right, that's cool. How well does it work? Well, the way we get these things to vibrate is by playing sound. And we have each of these little needles; we design it so that it has a different natural frequency by making them different lengths or different cross-sectional areas. We know that we are able to make the different needles vibrate independently of each other with the sound, and that works pretty well. But we don't know yet how well the person can feel those little vibrations because we've just gotten to that part of the study.
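The pins Angela describes behave like tiny cantilever beams: length and cross-section set each pin's natural frequency, so a speaker can excite each one with its own tone. As a rough sketch of that relationship (the steel material values below are generic assumptions, not her actual design), the standard Euler-Bernoulli fixed-free beam formula shows how strongly length matters:

```python
import math

def cantilever_f1(length_m, diameter_m, youngs_pa, density_kg_m3):
    """First-mode natural frequency (Hz) of a fixed-free circular beam,
    from the Euler-Bernoulli formula:
        f1 = (lambda1^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A))
    """
    lam1 = 1.8751                               # first-mode cantilever eigenvalue
    area = math.pi * diameter_m ** 2 / 4        # cross-sectional area A
    inertia = math.pi * diameter_m ** 4 / 64    # second moment of area I
    return (lam1 ** 2 / (2 * math.pi * length_m ** 2)) * math.sqrt(
        youngs_pa * inertia / (density_kg_m3 * area))

# 100-micron-diameter steel pins (E and rho are generic steel values).
# Frequency scales as 1/L^2, so small length differences spread the pins
# across distinct audio tones.
for length_mm in (6.0, 5.0, 4.0):
    f1 = cantilever_f1(length_mm * 1e-3, 100e-6, 200e9, 7800)
    print(f"L = {length_mm} mm -> f1 = {f1:,.0f} Hz")
```

Because frequency goes as one over length squared, halving a pin's length quadruples its tone, which is what lets an array of near-identical pins be addressed independently by sound.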
Starting point is 00:55:44 We know that if you put your hand at just the right distance, you can feel them pretty well. But if you put your hand down a little too far, you'll make them stop vibrating and then you can't feel them anymore. Or if your hand's a little too far away, it doesn't make contact and you can't feel them. So we're trying right now to figure out how to get your hand at the right distance so that you can feel all the little needles vibrating when they vibrate. I'm torn between asking what different musics sound like and asking something far more useful. Like, does it have to be your hand? It does not have to be your hand, but different parts of your skin and different parts of your body have different densities of touch receptors.
Starting point is 00:56:35 And so the hand actually has a pretty high density of touch receptors, and that's why I use the hand as an example. If we wanted to make a really big one and put it on your lower back or something like that, you could. But your lower back has a lower density of touch receptors. Maybe you wouldn't feel it as well. Yeah. I mean, your hands are just so critically important that you don't use your hands to see, because you use your hands a lot. Yeah, right. But yeah, that would have to be a secondary shift. After it works, you can figure out if you can put it on the backs of people's calves,
Starting point is 00:57:18 put the batteries in their shoes, and off we go. Yeah, and many of these studies have not been done because we haven't previously had a device that had teeny tiny, you know, vibrating things that we could use to do these kinds of studies. So one of the exciting things is actually learning more about human touch by using the device. What have you learned? Well, so far all we've learned is that it's pretty hard to figure out how to get the human to be able to feel them. Because if you're too far, you can't feel them; if you're too close, you stop them. Right. So we need to figure out some kind of an interface in between that will allow the person to feel it. Kind of like a glove, so that they're in the right position all the time? Oh, that's possible. What we're trying right now is putting a little
Starting point is 00:58:13 piece of rubber on the end of each of those little needles so that once you get down and you touch it, you can feel the rubber, but the rubber is kind of squishy. So you don't really stop the needles vibrating when you're touching the rubber. So we're trying different things like that. When you talk about tiny needles, I'm of course thinking about normal size needles. How big is tiny? Oh, we're shooting for about 100 microns, which is about the diameter of a human hair. That seems small and prone to bendiness. Yes, but it depends on the length. So if we had one of those that was very long, it would be very bendy. And if we had one of those that was very short, it would be not as bendy. So we just have to pick a material and a length and an interface so that you actually don't go ahead and bend them.
Starting point is 00:59:21 But they do have to be kind of bendy because, remember, what we're trying to do is make them vibrate and they vibrate by bending. So a little bit of bendiness is actually necessary. This still seems like a very difficult manufacturing problem. It's a very difficult manufacturing problem. So some of the approaches we've taken have included wire EDM. Wire EDM is this method where you put a charge through a wire, and then when the wire gets close to a piece of metal that is submerged under a fluid, the electrons dissolve the little bits of the metal. And so you can cut it almost like a cheese cutter works, cutting through the metal. And if you cut channels in the metal one way and
Starting point is 01:00:14 then rotate the metal 90 degrees and cut channels again, you would be left with little tiny square cross-section pins sticking up off the base. So that's one way to make these little things. You can also use 3D metal printing, like sintering, to make the little pins. And we're also investigating wire drawing. So we actually draw a wire to a particular diameter, and then we have a machine that cuts the wire into particular lengths, and then we assemble these things into pins that stick up off of a base that way. But yes, manufacturing is one of the biggest challenges of the project at this point. How do you attach them to the base? Well, in most of our manufacturing methods, we are either 3D printing them off the base or we're cutting them into a block. In those cases, we don't have to worry about attaching them. In the wire drawing case, we cut, we drill holes into a base and we're doing this using
Starting point is 01:01:29 like micro drilling, a teeny tiny drill bit. Yeah. I mean, if you're going to put little teeny tiny things in it, it can't be very big or they just flop over. Yeah. So we drill these little tiny holes and then the wire, we bend the end of the wire so it has a little hook, and then we drop the wire through the little hole. And then we put something like a resin on the back that holds the pins in place. That seems hard. Okay. Now that my brain is full of teeny tiny vibrations and drill bits, I have some questions about my robot. Okay. It's the $50 MeArm, and it is not exactly repeatable in its movements, largely due to mechanical slop and super cheap things. It doesn't have any encoders. So, I also have a kick-butt processing system. I have a TX2, the NVIDIA system.
Starting point is 01:02:48 I want to be able to use my camera to make up for the deficiencies, the general mechanical instability of the MeArm. Is this going to be possible? I'm going to say yes. And I hope you do this and tell me how well it works. So yeah, all you need is some kind of a sensor that can reliably tell you what's happening at the end effector. I think your biggest problem here is going to be that you have a bunch of non-linearities between how your joints are moving and what your camera picks up as its positions. And that's going to make it hard to
Starting point is 01:03:35 figure out your control gains and things like that. But I don't see any reason this would not be possible. That sounds like a great idea. I think you should try it. Oh, I've tried it some. With not great success, but that's okay. Oh, do you know why? Well, my camera, so I made the Mi Arm into a little robotic cat-like thing. It would chase a laser. And that worked really well. And the camera would identify the laser location and send the arm to it. But it wouldn't look at where the arm ended up. As I try to make a typing robot, I put a blue sticker on the yellow arm and try to find where the arm is in the camera. And the results are not as consistently good. The blue sticker is very dependent on light and the camera has its
Starting point is 01:04:37 resolution and the arm has its resolution. And one of my problems so far is my camera is 2d and my arm is 3d and i don't know if that is causing some of my problems because i i don't know if i'm off because i'm down a little or i'm off because i'm over a little yeah is your camera attached to the end effector of the robot or is it attached to the base? It is attached to a tripod. Thank you, Christopher, for that hand motion. It's attached to a tripod overlooking the whole system. Okay. All right. So that would take out some of your nonlinearities, but it would not directly give you feedback on the position of the robot. Have you considered putting the camera on the robot's end effector? Did I mention it's a $50 robot arm?
Starting point is 01:05:45 This thing can barely lift a pencil. Oh, okay. But I could put another camera in the end effector space, which is to say if I wanted to type, I could put the end effector so that it looked over the keyboard more closely instead of over the whole system. It would only be able to see the end effector bit that is typing. And that might help. Okay.
Starting point is 01:06:16 I don't know. Yeah. Okay. Well, I mean, basically, I just wanted to know if I was chasing a laser like a cat, because right now I'm lost in the world of, you know, maybe I should figure out the math behind this before I just keep trying to optimize blue versus yellow. Yeah. So, yeah, that's, it's been fun.
Starting point is 01:06:42 And you think that the, well, the end effector, yeah, if the end effector had a camera, it would be a lot easier. Because then it could just go to wherever it wanted to. And if it wasn't there, it would go a little bit more, a little bit less. Right. But I can't put it there. I can put cameras not attached to the arm. Can the camera detect the location of the end effector? Can you put a little colored dot?
Starting point is 01:07:09 That would be... Well, the overhead one can't because the end effector is going down to type. But if I had a second camera, I could look at where the end effector is. But again, it would only be 2D. Yep. Can you have two cameras 90 degrees apart from each other
Starting point is 01:07:29 yeah i can what would what math do i need to make those images go together is that homography with that what are the words I should search for? I think what you should search for is coordinate transformation. Because you just need to know where the positions are in different frames. Okay, yeah, no, I don't have to join the vision part. I just have to join the after vision part. Yeah. Although my cameras move with respect to each other. Oh, they do but i i can calibrate them at the start that's what i do now yeah mostly i wanted a cheap system and a powerful intelligence to make it all work out because i
Starting point is 01:08:20 figure that's kind of what humans are. Yeah, that's a very interesting problem to me, how to take a not very capable system and make it better through its smarts. That's fantastic. Thank you. Well, with that encouragement, I think I'm out of questions. Angela, do you have any thoughts you'd like to leave us with? Thanks so much for having me. And it's been a great time. I wish you best of luck on your cool projects that you've got going on. And same to all the makers out there. Keep making.
Starting point is 01:08:57 Thank you. Thank you for being here. We've really appreciated speaking with you. Yeah, thank you. I had a great time, and thanks for inviting me. Our guest has been Angela Sodemann, professor of engineering at Arizona State University and creator of RoboGrok.
Starting point is 01:09:16 That's robogrok.com, and of course, there'll be a link in the show notes. We'd like to thank our Patreon supporters for Angela's microphone, and Christopher for producing and co-hosting. And of course, thank you for listening. You can always contact us at show at embedded.fm or hit the contact link on embedded.fm while you're checking out those show notes. And now a quote to leave you with, this one from Isaac Asimov: Self-education is, I firmly believe, the only kind of education there is. Embedded is an independently produced radio show that focuses on the many aspects of
Starting point is 01:09:57 engineering. It is a production of Logical Elegance, an embedded software consulting company in California. If there are advertisements in the show, we did not put them there and do not receive money from them. At this time, our sponsors are Logical Elegance and listeners like you.
