Episode Transcript
[00:00:02] Speaker A: Hello and welcome to the Rubber Duck Dev Show. I'm Kresten.
[00:00:06] Speaker B: And I'm Coda.
[00:00:08] Speaker A: And today we are going to talk about exploring robotics. But before we get into that discussion, normally what we do is discuss what transpired over the last week. However, in the case of the last episode, it's been quite a while due to the break. So of course, I wanted to say happy new year to everyone. I hope you had a great holiday season. And in terms of what transpired over the break, you should see a fair bit of changes. One is that Chris is not present now; he's just become so inundated with work that he said he couldn't commit to being present for every single episode. So he may be on and off the podcast at this time, but like I've mentioned to others, the show must go on. So I continue to go ahead and produce episodes.
So that's one thing. The other thing is that we're trying out a new recording platform, so hopefully the sound and the video quality will be better as well. Another change is that we're now actually doing post-editing of the content, so we'll kind of see how that works. I don't quite know how it will end up, but I hope it'll look better than some of our previous attempts and we can balance sound and things of that nature. So I'm hoping that will go well. Apart from that, what else has been going on over the holiday season? Well, I got Covid, so that was exciting, and my whole family got it, so that was quadruply exciting.
But apart from that, I am still working diligently on finishing my Scaling Postgres course. So that's something I'm working on. I'll have more information about that in the coming weeks, but that's pretty much my update, and I think I've done enough talking. So, Coda, what have you been working on the last week or so?
[00:02:13] Speaker B: Yeah, so actually there have been a few things.
So I've been on this show a couple of times, I think, but for those of you who don't know, I work in robotics and I started a startup a while back. And so a lot of times I think about what the new technologies are in the world. So I've been doing a little bit of exploring, basically for our company, looking at sort of the next generation of AI tooling and how that really impacts our development processes.
Kind of embarrassingly, actually. My knowledge about the AI tools out there was before pretty limited and probably still is a little bit.
And everyone knows about GitHub Copilot and ChatGPT and things like that. But I've been looking at leveraging tools for making prettier presentations and for things like doing some of the front end development work for us. So there are all these tie-ins with Figma and Builder.io that kind of automatically generate React code for you using these AI systems. So that's been pretty interesting.
So I was given a task by the CEO, or the chairman, of our company to go to Korea to inspire the developers there. So I figured this was as good a topic as any, you know. Yeah, actually, I am excited about this topic, so I think it'll be fun.
And then I think I've been kind of going back to way back when, maybe 15 years ago or so, almost 20 years ago, when I started getting a little into 3D modeling kinds of things, and I kind of took a step away from it. I mean, I use it for work, or I have been using it for work a lot, but just never really for fun.
But this weekend I've been making this little train.
I don't know if.
Let's see if I can find.
This is the train it's based off of. So this is in Tokyo. This is kind of the main loop line in the middle of Tokyo, and I think they come once every one or two minutes usually. There's a constant flow of these trains in the middle of Tokyo. And so I made a sort of mini version of it. Yep. Yeah. So you can open and close the doors, the wheels kind of rotate, things like that. So I'm thinking I'll pull this into Three.js or some other 3D tool, and I don't know what I'm going to do with it, but at least I want to make it run around. I guess that's been my past week or so.
[00:05:26] Speaker A: All right, cool.
All right, we're exploring robotics, and this is something you know a lot about and I know very little about.
So I guess first, if you could kind of give an intro into a little bit of your history with regard to robotics.
[00:05:49] Speaker B: Yeah. So I think for me, I first developed an interest in robotics when the Mars rovers went up in, I think, 2003 or 2004, Perseverance or whichever one it was. I figure there will be a little picture of it somewhere that pops up on screen.
So based on that, well, I got really interested in that. And, well, I grew up in the Boston area, and I'm still here, but a professor at MIT that I sort of just had a very loose connection with actually let me kind of sit in on his classes at MIT for about a semester when I was about ten.
So I didn't really learn much at that time from that, but at least having that exposure was really big, and I think was just really a unique way to kind of explore a field when you're really a young kid. So since then.
[00:06:59] Speaker A: I didn't have MIT in my backyard to be able to just go drop in on classes where I was growing up.
[00:07:05] Speaker B: It's lucky that I grew up in an area that's so close, right, to do that kind of thing. There's MIT, Harvard, things like that. And I guess pretty much every area has good universities as well.
[00:07:19] Speaker A: Well, it's different. It's unique, the cluster that's there.
[00:07:24] Speaker B: Yeah. Yeah. So it's certainly lucky on that front. So, having that exposure really made a big difference for me. And from there, I started writing software, because at the time, as a kid, I didn't have enough money to build robots.
But if you write software, that's a lot cheaper. So there was this nice, fat C++ book from, I don't know even when, but I actually went through it later that summer, and that was my first exposure to programming. I think nowadays it's much easier to get into it because there are so many resources. But back then, the way to go was to look at one of these big books and just sort of go through it.
[00:08:15] Speaker A: And I guess I learned from many big books. It's fine.
[00:08:20] Speaker B: Yeah, you can see there's robotics stuff, there's a lot of programming. I have some Rust books, geometric algebra, all sorts of different things over there. So I'm still a firm believer in the old-school book strategy.
And then I went to school for electrical engineering and then went to a robot AI company, worked there for a little bit, and then after that, started a robotics company.
[00:08:54] Speaker A: In high school, so. Ten years old, clearly. Is that even middle school? Well, not even quite.
[00:09:01] Speaker B: I was in elementary school.
[00:09:03] Speaker A: Yeah.
So did you do anything in middle school or high school?
Or was it just learning on your own, like the C++? Were you in, I think some schools have a robotics club or something. Did you do anything during those years?
[00:09:22] Speaker B: So, in high school, at the time, there was a robotics club that had kind of just started at our high school, and back then, they had the BattleBots, and that was the only thing they had. And frankly, I personally was not that interested in that.
And so I ended up basically starting my own group within the club that was focused on doing autonomous robot kinds of things.
So we did this competition called Botball, and we did okay while I was there. But the year after I left, I think the team that I'd been a part of was, like, in the top three or four in the world. So clearly the people after me did better.
[00:10:17] Speaker A: Is that a thing? What did you call it? Robot ball.
[00:10:21] Speaker B: Botball.
[00:10:22] Speaker A: Botball. Is that like, yeah, a sport for robots, so it's...
[00:10:30] Speaker B: It's a little robot competition. So they give you some task where it's supposed to simulate some sort of industry situation.
I remember there was one where I think you were supposed to move, like, oil barrels or something like that, right?
But it's all using Legos, and it's very small scale, and that's a big part of it. So I think maybe you've heard of FIRST Robotics. I don't know if you're plugged into that.
[00:11:06] Speaker A: Like I said, my robotics knowledge is about this thing.
[00:11:09] Speaker B: Yeah, no, but there's a very famous, popular competition called FIRST Robotics, and that is at, you know, a very high level and all these things. And honestly, the work they do is very impressive. They have sponsors from industry helping the different students in different schools and all these different things.
But the one thing that is an issue is it's very expensive for the schools to run, especially because a lot of these robots are taller than an adult human, and they're very large. I mean, they're these massive machines that drive around, and you not only need the resources to just buy the parts to be able to do that, but you have to have the manufacturing capabilities, you have to have a space to test that out. I mean, even just a space to store them can be an issue in a lot of schools.
Botball, it's like all the robots are almost handheld, right?
But it was much more focused, kind of on the software side of things, at least when I did that.
[00:12:28] Speaker A: The only connection I have with regard to Legos is, of course, the Mindstorms stuff.
But that was still.
I could have potentially introduced my son to that, but still, that was pretty expensive at the time when he was growing up, so we never kind of pursued that.
[00:12:50] Speaker B: Yeah, I think that's kind of the big problem. Right.
One of our newer employees, she doesn't have a background specifically in robotics. She's more like a data analyst.
But she figured it's kind of a good opportunity to learn these things. So I was looking at the different tools to kind of learn robotics, and costs have not come down.
You'd think that having a little robot with some sensors on it and things like that would be pretty affordable, especially nowadays with Arduinos and all these different things coming out. But it's actually pretty expensive unless you're willing to actually build it yourself. But in order to do that, you need to know how. Right?
[00:13:41] Speaker A: Yeah, it sounds like that's ripe for somebody.
Like, the Raspberry Pi is something that came along and made, like, a $50 computer, something super cheap. It sounds like there's an opportunity for some company to make some sort of cheap robotics product that does something. I don't know.
[00:14:01] Speaker B: Yeah, well, there have been a lot of attempts, right? There have been a ton of attempts, but, okay.
I think the problem is that robotics often requires a fair amount of computational power, and you might need some sensors and things like that, and very quickly, the cost just shoots up.
[00:14:25] Speaker A: True.
[00:14:27] Speaker B: I think the one. Oh, go ahead.
[00:14:30] Speaker A: No, I was saying, as I was prepping for this call and thinking about robotics, it's like, okay, you can, of course, correct me, but I'm like, okay, you have to know some software engineering. So that's one axis. And then, okay, now you're introducing hardware, and for a lot of software engineers, that's an entirely new axis that they haven't dealt with before. Like, my only exposure was playing around with an Arduino, or however you pronounce that, for a little bit. And that requires a special thing. But now you're introducing robotics, and, yes, your hardware with an Arduino may have sensors and whatnot, but you're talking about movement, and it's a whole other axis of complexity that goes along with it. So I'm like, yeah, that's the trifecta of stuff you have to learn to be able to do this, at least from my layman's understanding.
[00:15:25] Speaker B: Yeah, and that's a good point. And that's actually, I think, one of the biggest reasons that I've been interested in robotics. Right. Because it's a cross section of so many different fields. So you work with people of very different backgrounds, different knowledge, different skill sets, and you have to understand, at least at a baseline, how things work in kind of a different way, and it keeps things always very fresh as far as kind of the different industries that are involved.
You mentioned there's software, and there's a hardware component, which very often means electronics, so some electronics engineering or electrical engineering kind of thing. In some cases, you might want to be designing your own PCBs.
And then there's sort of the mechanical aspect: you have these physical machines that you're manipulating, usually. But beyond that, there are also these big subfields within engineering, like controls, controls engineering, which is...
[00:16:41] Speaker A: I'm sorry, can you repeat that again? What's the comp.
[00:16:44] Speaker B: Controls.
Controls is a full field on its own.
And what it is, is basically the math of, let's say you have some desired input, some desired motion.
How do you modify your system to give you some desired output?
And so within that, there's all sorts of different things as well.
[00:17:17] Speaker A: Could you just give me a basic example of that?
[00:17:20] Speaker B: Yeah.
[00:17:22] Speaker A: You're trying to make a robot go from point a to point b. Or is there something, or if you could just give me a basic example.
[00:17:29] Speaker B: Okay, let's say you're a pilot, right? And so you fly a plane, and one of your tasks is to make sure that your plane flies at a certain elevation.
So in order to do that, if you go a little bit, let's say we're trying to get to here, if you go a little bit over, then you try to get the plane to go lower a bit.
That's a very basic controls problem.
But there are many examples, or even.
[00:17:59] Speaker A: Like cruise control in a car.
[00:18:02] Speaker B: Would that be exactly.
[00:18:04] Speaker A: You want to maintain a certain speed, but now you got a hill you got to deal with, so you need to make adjustments to do that.
[00:18:10] Speaker B: Yeah, that's absolutely a common kind of problem like that. I think also, for example, a thermostat controlling the temperature: your HVAC system has a control loop in it, and it uses these kinds of control theory problems within that. I'll also show you, this is called...
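[Note from the editor: to make the cruise control and thermostat examples concrete, here is a minimal closed-loop sketch in Python. The car model, the gain, and the hill disturbance are all invented for illustration; they are not from the episode.]

```python
# Minimal proportional (P) controller sketch: hold a car at a target speed
# while a hill tries to slow it down. The vehicle model and gain are toy values.

TARGET_SPEED = 30.0   # desired speed (m/s)
KP = 5.0              # proportional gain (arbitrary tuning value)
DT = 0.1              # simulation time step (s)

speed = 25.0
for step in range(200):
    hill_drag = 1.0 if step > 100 else 0.0   # a hill appears halfway through
    error = TARGET_SPEED - speed             # feedback: measure, then compare
    throttle = KP * error                    # control action proportional to the error
    # toy dynamics: throttle accelerates, rolling drag and the hill decelerate
    speed += (throttle - 0.1 * speed - hill_drag) * DT

# A pure P controller leaves a small residual error; adding an integral term
# (making it a PI or PID controller) is the usual fix.
print(f"final speed: {speed:.2f} m/s (target {TARGET_SPEED})")
```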
[00:18:42] Speaker A: And it's a little bit of feedback loops as well.
The biologist in me is like, okay, feedback loops. Okay.
[00:18:52] Speaker B: Right.
Yeah. So this is actually called an inverted pendulum. So basically, how do you drive this car to make sure that this stays up?
A Segway is a good example of that, actually. The interesting thing about it, you mentioned your biology background, but a lot of these same kinds of problems come up when deciding how much of a chemical to put into some sort of process. Even in a lot of biology research and things like that, there's actually a lot of use for this. And with chemical engineering as well, control theory is a big part of that. So it's a pretty broad field. And actually, just one other thing. You mentioned that there's a feedback loop involved.
Yes, that's true, but I was also a little bit careful not to specifically state that, because in many cases there's something called open-loop control. And with that, you're basically just saying, okay, I want this thing to behave like X, and I don't have any sensors to see what the output looks like. I'm just going to assume that it did that correctly. So that does exist.
And it's actually sort of one of the earlier ways that you kind of think about control when you're going through school studying that as well.
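[Note from the editor: continuing the toy model above, here is a sketch of the open-loop versus closed-loop distinction just described. The open-loop version applies a precomputed throttle and never checks a sensor, so it drifts once the hill appears; the numbers are again purely illustrative.]

```python
# Open-loop vs. closed-loop control on the same toy car model (illustrative values).

TARGET, KP, DT = 30.0, 5.0, 0.1

def simulate(closed_loop: bool) -> float:
    speed = 25.0
    for step in range(200):
        hill_drag = 1.0 if step > 100 else 0.0
        if closed_loop:
            throttle = KP * (TARGET - speed)  # reads a speed sensor and corrects
        else:
            # precomputed command, no sensor: it exactly cancels drag on flat
            # ground, then quietly goes wrong once the hill shows up
            throttle = 0.1 * TARGET
        speed += (throttle - 0.1 * speed - hill_drag) * DT
    return speed

print("open-loop final speed:  ", round(simulate(False), 2))  # drifts below the target
print("closed-loop final speed:", round(simulate(True), 2))   # stays near the target
```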
You have your control theory, you have your mechanical engineering. In some kinds of robotics you have to worry about, for example, like pneumatics or hydraulics. So then you have sort of fluids related problems to solve as well.
A lot of robotics research actually comes from sort of biomimicry. So you look at how systems in biology in the animal kingdom, for example, work, and you try to model something that behaves in a similar way. So it's kind of this cross section of so many different fields, and there are so many different ways to approach it. I think, actually with the biology part, too.
The professor I mentioned who pulled me in, there's a paper in Nature that has a photo he took of his research on the front cover. And what he was doing was using basically light to control muscle tissue to try to make a robot that's powered by muscles, and you feed it blood and things like that.
Really, the thing about robotics that I found so interesting is that it's kind of a meta field, right, where it's not really any one field; it's kind of a conglomerate of many.
[00:22:24] Speaker A: So that was kind of some of the stuff you did in high school. What did you do in college that kind of continued to put you on the path?
[00:22:33] Speaker B: Yeah, so I think my major was in electrical engineering, and then I did a controls and robotics concentration, and then I was an intern at a robot AI company that was run by a few professors at a different university in the Boston area. And they kept telling me that I should drop out so I could work for them full time.
[00:23:05] Speaker A: Guys, you just need to wait a few years or whatever.
[00:23:09] Speaker B: Yeah, right. Yeah, I think at that point it was a few months.
It was only a few months, right? You'd think that it wouldn't be a problem to wait unless they thought you.
[00:23:21] Speaker A: Would be snatched up by someone super quick if they didn't act fast.
[00:23:25] Speaker B: I'd already agreed to work for them after I graduated, so I don't know why there was such an urgency, but I ended up basically just doing that. And then I was working on basically more computer vision and artificial neural networks kinds of things.
And so this was before GPU based? Well, this was before.
[00:23:57] Speaker A: I think this.
[00:23:58] Speaker B: It was basically just after that big, I think it was called GoogLeNet or something, the first really kind of popular image classifier using convolutional neural nets. So it was basically the first time that machine learning techniques, neural net techniques, really outperformed humans on a task that people thought, wow, computers will never be able to beat us on this.
[00:24:32] Speaker A: And this was for what specifically? A vision problem?
[00:24:36] Speaker B: Yeah, this was basically classifying. So if you give it a picture of a dog versus a horse, being able to tell which it is and to not say that it's a cat.
[00:24:47] Speaker A: Right.
What time period was that, roughly?
[00:24:55] Speaker B: I think that was around 2010 or 2011 when that came out. And then I was doing this work maybe two years after that or something. But at the time, there was no tooling, really, around a lot of it. TensorFlow, that's the really popular library now, hadn't come out yet.
All of the research was in a Python library called Theano, which I think got discontinued. But it was basically still very early on in what we today think of as AI and that realm. So a big chunk of what I was doing there was actually reading neuroscience papers and trying to model the electrical activity of neurons.
And I'm not a biologist, I'm not a neuroscientist, so of course, this meant nothing to me. But what I discovered is, at the very back of all of these papers, there's a circuit diagram which describes the activity. So I would look at the title, I would kind of look at the abstract and think, well, I don't understand any of this. And then I would flip to the back and basically derive the equations from the circuit diagram and implement that.
That was what I was doing when I was an intern there. And then eventually, basically, I was implementing early kinds of neural nets on mobile phones, on like the iPhone 5 or something like that, one of those. And at the time, what we would do was render...
So there wasn't any information online, of course, about this.
What we were doing was we would render the neural net as a shader, a fragment shader in OpenGL. So we would do all of our computation and render to a virtual buffer or a texture in OpenGL and then read the texture back from the GPU.
So it was faster than running it on the CPU, but exceptionally slow compared to these days. You'd either use CUDA, or if you're talking about a phone, they basically have APIs specifically for this. If you're not using one of those, you can just use OpenCL. Or if you want to do something closer to what I was doing, essentially both GLES, the mobile version of OpenGL, and OpenGL, I think 4.0 and later, have a compute shader where you can actually do GPU computations directly through OpenGL, with the intention of it being more for the math rather than for display.
[00:28:31] Speaker A: And just to remind me again, and you were doing this type of vision work, what was the ultimate purpose of it?
[00:28:40] Speaker B: That's a good question.
So a lot of it is basically.
[00:28:47] Speaker A: Okay, bring it around to a practical example of how this impacts robotics, basically.
[00:28:53] Speaker B: Yeah.
So a neural network is basically a system that can approximate a nonlinear system. Basically, it interpolates.
So if you have a neural net. And I'm going to kind of draw a little bit here.
So, well, let's say we have two points, right? So we have point a and point b and we want to say, okay, and let's say these are like home prices, right?
And so maybe that's the square footage. So we have a 1,000 square foot home here and we have like a 4,000 square foot home here. And I don't know why anyone would have a home that big, but I guess some people do.
Maybe this just has a linear relation, right? But say we're figuring out a 2,000 square foot home that's missing from our data. Yeah, literally, I don't buy homes. But then if you were to say, okay, well, how much does this cost? In this case, you might say, okay, well, this has a linear relationship; you can make an equation for this and solve it very easily. But what if instead it's a very complex thing, and this line, instead of looking like this, looks something kind of like this, right?
[00:30:33] Speaker A: Yeah.
[00:30:33] Speaker B: And you have no idea what it's going to be. Or maybe you have, maybe instead of just the square footage, the location matters, the transportation that's nearby matters, maybe the education system matters too. And like the current state of the economy, right? And then, so all these different factors play in and then you suddenly have this very complex system where you can't just write an equation to model it. So what a neural net does is it kind of approximates that. So it basically looks at all these examples and says, okay, it has all these different parameters that are just, and tries to basically fit that system to approximate.
And so these kinds of things are very useful in robotics because, well, for example, with the earlier image recognition problem or a classifier problem, essentially, that's just the same kind of thing that we're doing here. So you have a lot of inputs, which is each pixel on the screen, and then you are told as an output, this is a cat.
And in order to get from these individual pixels on the screen to cat, there's basically a ton of math that goes into it, which largely is just multiplying matrices together.
But that's also kind of the same idea as this, where you might have a bunch of points where these are all different cats, and then you have a different cluster that is like, maybe shoes look like this. And then, I don't know, maybe pandas kind of are somewhere in here. And your model isn't very good at distinguishing between pandas and either of the other two categories. But either way, it's basically taking these input parameters and then trying to guess at what some output state is. So that's essentially what neural nets are.
When I was working on that, I think it was just sort of generic neural-net-related work with kind of the vague understanding that it might be used for vision-related things or maybe decision making. Let's say we have a block, some sort of cube or some object that we want to pick up, and maybe we don't know what the object will look like before we go to pick it up. So we have some robot arm, and our robot arm doesn't know how to approach this. Does it grab it like this, or does it grab it from above? Like, how do I grasp it? So there are problems like that. This is called grasp synthesis, by the way. But problems like that actually involve a lot of the same kind of problem as this classifier or what we were talking about with the price of a home. It's actually a very similar kind of problem, but in a very different domain.
And all these things are kind of solvable using neural nets. So I was looking at this at the level of, how can we model a neural net system that's based on how the human brain works, to try to get a computer to learn more abstract things?
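[Note from the editor: for listeners who want to see the "it's mostly multiplying matrices" point in code, here is a minimal forward pass for a one-hidden-layer network in Python with NumPy. The weights are random rather than trained, and the two-feature "house" input just echoes the example above; none of this is the guest's actual model.]

```python
import numpy as np

# A tiny one-hidden-layer neural network. The forward pass is just matrix
# multiplies plus a nonlinearity, which is what lets it approximate curves
# that a single linear equation cannot fit.

rng = np.random.default_rng(0)

# Illustrative input: [square footage in thousands, distance to transit in km]
x = np.array([2.0, 1.5])

# Randomly initialized weights; a real model would learn these from many examples.
W1 = rng.normal(size=(8, 2))   # 2 input features -> 8 hidden units
b1 = np.zeros(8)
W2 = rng.normal(size=(1, 8))   # 8 hidden units -> 1 output (e.g. a price estimate)
b2 = np.zeros(1)

hidden = np.maximum(0.0, W1 @ x + b1)   # matrix multiply, then ReLU nonlinearity
output = W2 @ hidden + b2               # another matrix multiply

print("untrained network output:", output[0])
```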
[00:34:25] Speaker A: It's like if a human goes up.
Yeah, I think this is really.
Would you say pattern recognition?
[00:34:37] Speaker B: Yeah, because that's what vision is doing.
[00:34:39] Speaker A: When you look at AI.
That's the whole. It's pattern recognition what it's doing now for some, hey, write me a poem. So it uses its reference of a poem. What should that kind of look like? And throw in a little, if jitter is the right word, but little variations to come up with different things each time.
Because if a human looks at a box and say, someone gives you the instructions, go pick up this box, you approach the box, you say, how am I going to pick it up? Am I going to do it this way? Am I going to do it this way?
And then I do it a certain way. So you want a robot to basically make that same kind of determination. It's been given instructions; can it figure out how to do it?
[00:35:25] Speaker B: Yes, it's exactly that. Right. So like you say, LLMs, like ChatGPT, are also essentially that same sort of thing. It's a generative model, so it generates something instead of just classifying. But it is kind of the same idea. The fundamental piece is it's ultimately just interpolation and pattern matching, essentially, yeah.
A lot of robotics software is actually around this kind of thing now. So if you're interested in going into robotics, AI, machine learning, those kinds of areas are very hot right now, including for robotics. And kind of the next step in AI, and you're seeing this a little bit with autonomous cars, is to bridge that gap between our computer screens and our phones and the physical world.
[00:36:33] Speaker A: So when you were talking about, when you said AI, I don't want to say late to the party, I don't want to put words in your mouth, but you were just getting caught up on it. And I know I was late to the party. I mean, I was basically sleeping. And then it was last December or whatever, during the holiday break, that my mind exploded looking at all this ChatGPT AI stuff, like, what the heck? And then I've mostly been using it for marketing and things of that nature over the last year. In terms of the robotics field, how did...
Well, clearly AI has been involved in robotics for many years, but at least over the last year, ChatGPT and LLMs, large language models and all that sort of stuff, how has that changed the robotics field now and going forward, do you think?
[00:37:31] Speaker B: I think the biggest thing right now is that one of the hardest parts about robotics is the human interface, right? It's called an HMI, a human-machine interface: sort of where that cross section is between you as a human, sitting or standing there doing whatever tasks you're doing, and the machine, and making sure, first of all, that the machine tells the person what it's doing in a concise way and in a not very obnoxious way.
So in some cases, it's almost an emotion kind of thing. So you see, with some robots, some robotics companies, they actually hire, like, animators, character animators, to design the motion and, well, I guess, the emotion of the robots, right? So you try to get different behaviors like that.
But the other big part about it is generalizing or making an interface that's easy for humans to work with. So if you can speak to a robot and you can use sort of general language, that's much, much more efficient for us than having to remember, okay, we need to give this specific instruction, or there's often a greater learning curve with that.
I think that's a big part of it, apparently.
[00:39:16] Speaker A: I'm thinking of, like, ChatGPT. What has made that so powerful recently for people is that you can give it a prompt in relatively plain English. It doesn't have to be super structured, and it kind of, well, it takes its best guess at what you wanted and then spits some output at you.
It sounds like what people are working on is to achieve the same thing with a robot. Like, if you say, well, I'll say in the future, cook me a steak, it'll know what it's supposed to do, if there's a robot that does that, right?
[00:39:52] Speaker B: So it can maybe go out and research, watch a couple of YouTube videos at 50x speed and try to learn how to cook a steak well, and then go through New York Times Cooking and look at the recipes there, and then say, okay, now I'm ready to cook a steak, and then it'll take its arm and move things around. And I think that's actually, in a lot of ways, not too far out.
And you're starting to see a number of companies trying to do exactly that.
So in that sense, I think it's a big shift. But what you might find interesting is that OpenAI has actually been around for a much longer amount of time, of course, right? And a lot of the work they did before was actually with robots. So a lot of the problems that they looked at initially were things like, if you have paper cups on a table and you want to stack them, then how do you do that? Or if you took the paper cups and a plastic ball and put them all in the same environment and just said, okay, well, figure out what to do with this. Then it might put the paper cup face down and then put the ball on top, or it might take the ball and put it inside of a cup, or various things like that, where you're sort of trying to do more exploratory work. So their background is actually a lot more robotics focused than I think a lot of people might be aware of.
They've been a huge innovator in that field for probably about ten years now, something like that.
[00:42:02] Speaker A: So go ahead.
[00:42:04] Speaker B: But, I mean, I think the biggest difference, or the biggest impact, is just that everyone is thinking about AI now, right? So with AI, it was kind of the first time that we really have a generic model. That's, I think, what's important about ChatGPT: it's the first time that we have a model that can do natural language processing and is pretty generic, in the sense that it knows a lot about a lot of different things, and sometimes it makes things up. But that's really the big thing: we had these AI models for specific conditions, and we had ways of dealing with them, but now we have a way to do something much more generic.
[00:42:49] Speaker A: Okay, so in terms of robotics, really early on, when you think of robots, hey, they use robots to build cars, and there's a very specific actuated movement. The robot must go here. It expects a part potentially to be there, picks it up, it moves it.
So it's very scripted, I guess, what the robot will do. And I get the sense, as you're talking about this AI stuff, they're wanting them to become less scripted and more, I don't want to call it versatile, I don't know if you have a different phrase in the industry for what you would call it.
When was the shift in robotics, or has it always been shifting and people have just been unaware, moving from very programmatic, only do these particular things, to more open field stuff?
[00:43:45] Speaker B: Yeah. So that distinction is basically, you're working either in sort of a structured environment or an unstructured environment, or you have some fixed task that you're doing.
And of course, most robots in a factory today still have more of that structure that you're describing.
A big part of that is speed and cost. So if you can do things quickly.
[00:44:19] Speaker A: And do things cheaply, I'm assuming. You don't need a lot of computational power to just do these particular movements.
[00:44:30] Speaker B: No, although actually it's more involved than you'd think, because getting a motor to move and stop at exactly the right location every time is actually trickier than you'd expect, especially when you have these very high powered motors. And I think I saw at a trade show once, a big robotics company called Fanuc had a giant robot arm that's really large, and it was picking up and spinning a car in the air. It was just kind of spinning it around and putting it back down and grabbing it and spinning it. And in order to do that kind of thing, the degree of engineering involved in making sure that the motor stops when you want it to stop is actually tremendous.
Not only. Yeah, I mean, just even from the amount of current that you need to pass through this thing and controlling that, that's not an easy problem.
And the field that I'm in is the AMR field. So these are autonomous mobile robots. So basically, there's this class of robot, which is similar to what you were describing, where they just do a fixed thing, but it's like little carts that would drive around the factory to move things from place A to place B. And before, what they would use is what they call magnetic tape, or maybe you've seen, with the Mindstorms you mentioned earlier, they have a little line follower or something like that.
Basically, it's that kind of technology that started before, and that was called an AGV, an automated guided vehicle.
And so those are still pretty popular, but they have the inherent limitation of, you have to put this marker on the floor. You can't change it on the fly. If there's an obstacle in the way, let's say there's a big pallet or there's a crowd of people or something, then it can't handle that.
So the AMR is basically a vehicle that looks at its surroundings and can say, okay, I'm trying to get to that point. How do I do it? And it will go and do it itself while following the rules of the factory. It's the same idea as autonomous cars, but for inside the factory.
So that's kind of the field that I'm in. And we do a lot of mapping related things. That's called SLAM, simultaneous localization and mapping, and then also a lot of the navigation and perception, so processing your sensor data. And then a big part of what I do, what our company does, is predicting the motion of humans or of forklifts and things like that. So a current, still somewhat open problem is, let's say you have people or other machines in a space, then how do you make sure that you disrupt them as minimally as possible? And a big part of that is to just make sure you can predict what people are going to do and make sure you kind of get out of the way, or in some cases, you actually have priority over whoever else is passing through. So being able to manage those decisions is a big part of that.
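[Note from the editor: for the "I'm trying to get to that point, how do I do it?" part, a common textbook building block is grid-based path planning. The sketch below is a plain A* search over a small occupancy grid in Python; it is a generic illustration with a made-up map, not the company's navigation stack, which layers SLAM, perception, and motion prediction on top of ideas like this.]

```python
import heapq

# A* on a small occupancy grid: 0 = free floor, 1 = an obstacle (say, a pallet).

GRID = [
    [0, 0, 0, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def astar(start, goal):
    def h(cell):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start)]     # (estimated total cost, cost so far, cell)
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, g, current = heapq.heappop(frontier)
        if current == goal:               # rebuild the path by walking backwards
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = current[0] + dr, current[1] + dc
            if 0 <= r < len(GRID) and 0 <= c < len(GRID[0]) and GRID[r][c] == 0:
                new_cost = g + 1
                if (r, c) not in cost or new_cost < cost[(r, c)]:
                    cost[(r, c)] = new_cost
                    came_from[(r, c)] = current
                    heapq.heappush(frontier, (new_cost + h((r, c)), new_cost, (r, c)))
    return None  # no route exists

# Route from the top-left corner to the bottom-left corner, around the obstacles.
print(astar((0, 0), (4, 0)))
```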
The main reason that these unstructured problems are becoming so important in manufacturing, though, is that a lot of people are interested in flexible manufacturing. So being able to, for example, use the same factory facilities to build one item and then the next day...
[00:48:58] Speaker A: And retool it for a different, totally different.
[00:49:03] Speaker B: And so that's really where kind of these unstructured things come into play.
[00:49:09] Speaker A: I mean, I was also thinking when you were talking about a robot that follows this very guided path, presumably, that's one robot that has one duty, and it only does this thing, whereas you have something more autonomous.
Now, it can do a couple of different things. Or if one of the robots breaks or shuts down, well, another could take over its role or whatnot.
[00:49:34] Speaker B: Yes, exactly. And then in our case, actually, we do the brains for these robots, right? So we work with big companies who do the manufacturing of the real robots, and they assume a lot of the risk, too, which is nice for me.
But really, the big thing there is we also do robot arms. Some of our customers have vehicles with robot arms as well; we call these mobile manipulators. And that opens things up even more. In one case, we have a robot driving around. It grabs some part from, I guess, the warehouse area, just grabs raw material and brings it over to a different machine, sticks it in to get processed, then takes that, drives almost a kilometer to a different machine in a different building, and inserts the part into that. If the machine needs some maintenance, there's some basic maintenance where you have to replace tools that it's using if they get dull or whatever, so we can actually do that replacement as well if it needs it. And then we kind of go off and do our thing, and while that's processing, we might go and do some other things, too. So that kind of thing really opens things up. And then on top of that, there's all this data that we end up being able to collect in order to monitor what's going on in the factory. So in our case, we can tie in with, for example, a system called Node-RED, which is basically IBM's drag-and-drop node editor for control. People use it for smart homes, they use it for factory automation, where basically, given some conditions, do something else. So we can tie into that to send you a notification on your phone if something goes down, or things like that. Right.
So there's a lot that kind of goes into it. But I think a big part of what makes systems like that attractive is that really the amount of effort required to integrate it into your workflow is pretty minimal. You don't need to uproot anything, you don't need to install a bunch of new systems. You just take this robot, you map out the layout and then you tell it to go and that's pretty much it.
I think that actually does quite a bit for that.
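[Note from the editor: as a concrete illustration of the kind of Node-RED tie-in mentioned above, Node-RED flows commonly expose an HTTP-in endpoint that outside services can post events to. The URL, path, and payload fields below are hypothetical; this sketches the general pattern, not the guest's actual integration.]

```python
import json
from urllib import request

# Hypothetical example: post a robot fault event to a Node-RED flow that has an
# HTTP-in node listening. Port 1880 is Node-RED's default; the /robot-events
# path and the payload fields are invented for illustration.

NODE_RED_URL = "http://nodered.example.local:1880/robot-events"

def report_fault(robot_id: str, message: str) -> int:
    payload = json.dumps({"robot_id": robot_id, "event": "fault", "message": message})
    req = request.Request(
        NODE_RED_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # The flow on the other end can then push a phone notification, log the event, etc.
    with request.urlopen(req, timeout=5) as resp:
        return resp.status

# report_fault("amr-07", "lift actuator over temperature")
```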
What's also kind of interesting in the robotics world is there's this pretty unique, I think, relationship between the academic side and the industrial side, in the sense that most companies these days use what's called ROS, the Robot Operating System. It's called an operating system, but it's essentially just a message passing library, some additional utility libraries, and a packaging system. So you can basically get these different programs and link them all together to create a lot of your robotics systems. So we basically write ROS nodes, and then we have a manager on top of that to facilitate all this, which we use Elixir for.
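[Note from the editor: for anyone curious what "writing a ROS node" looks like in practice, here is a minimal ROS 2 publisher node in Python using rclpy. It assumes a working ROS 2 installation with rclpy and std_msgs available; the node name and topic are arbitrary examples, not the company's code.]

```python
# Minimal ROS 2 node sketch. It publishes a status string once a second; other
# nodes subscribe to the topic, and the ROS middleware handles the message passing.

import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class StatusPublisher(Node):
    def __init__(self):
        super().__init__('status_publisher')            # node name (arbitrary)
        self.pub = self.create_publisher(String, 'robot_status', 10)
        self.timer = self.create_timer(1.0, self.tick)  # fire once per second
        self.count = 0

    def tick(self):
        msg = String()
        msg.data = f'heartbeat {self.count}'
        self.pub.publish(msg)
        self.count += 1


def main():
    rclpy.init()
    node = StatusPublisher()
    rclpy.spin(node)        # hand control to the ROS executor
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```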
But it used to be that ROS was a purely academic system.
It was originally started, I think, by a group that was a combination of Stanford and a few other places, maybe the University of Toronto, something like that, and a small company organization built around that called Willow Garage.
And they ended up basically starting this ROS thing, which at the time no one would ever use for industrial applications because it was so unreliable, so slow, zero security, nothing. And it's interesting that over the years basically everyone has switched over to using ROS. So it's really one of those cases...
[00:54:34] Speaker A: Where I'm assuming it's an open source platform.
[00:54:38] Speaker B: Yes, it's an open source platform.
It originally only ran on Linux, which I was fine with, but some people didn't like that very much.
No, but actually, that's one thing that's still very annoying about it.
It basically only runs on one very specific distribution. It runs on Ubuntu, and every ROS release is tied to a specific Ubuntu version.
[00:55:06] Speaker A: That's kind of strange. Okay.
[00:55:09] Speaker B: Yeah, it was a huge pain originally. And this has gotten a lot better with what they call ROS 2, which is a complete sort of rebuild of the system. So the old packages are no longer compatible, but everyone's been updating them and everything is okay now. But it really was a tremendous pain to use. The real issue is if you want to update a robot on a customer site and you are running an old version of Ubuntu because your stack is running on an old version of ROS, then in order to update your robot system, you have to update the whole operating system.
And in some cases it might just be flashing an image, but it's often a little bit more complicated than that, especially if you want to do like online updates or anything like that. It becomes tremendously difficult to manage.
That's a little bit better now, but still a little bit of an issue.
It's still sort of tied to specific releases, and I think it also lets you install it on Windows now, and maybe opens it up to two different Ubuntu releases sometimes. But essentially we just run everything in Docker. So I unfortunately know all the GPU-related flags for Docker now because of this, and all the network-related flags. Those are the two things that I've learned that I'm pretty confident about at this point, for better or for worse.
[00:57:15] Speaker A: So we're running up on time, but I did want to cover a little bit about software engineers. So if a software engineer is watching this, they could be still in school or they could be working for a company now. If they wanted to get into robotics, what is a path you would recommend given the landscape of the environment today? So they know some software development. What do you think are some paths they could potentially follow?
[00:57:45] Speaker B: There are so many.
[00:57:46] Speaker A: Maybe just give a few for a different variety, right?
[00:57:51] Speaker B: Yeah, no, at our company we have a few people who basically just do back end server work, because we coordinate hundreds of robots in a single factory. So when we do that, there's all this network stuff, and we have these APIs to run our front end off of. So we have engineers like that, we have some full stack engineers. We have...
[00:58:19] Speaker A: What kind of stuff do your engineers use on the back end?
[00:58:23] Speaker B: So we are a little bit unique on that side.
We actually use Elixir.
[00:58:33] Speaker A: Is that part of the Phoenix framework, or is it just pure...
[00:58:40] Speaker B: You know? Yeah, we do partially use it, but that's part of kind of the Erlang ecosystem.
So, the Open Telecom Platform from Sony Ericsson way back when, or maybe it was just Ericsson, I guess. So we use Elixir basically for that OTP platform, and then we actually don't leverage Phoenix as much as we should.
Phoenix LiveView is really cool, by the way. I don't know if you've ever checked that out.
[00:59:22] Speaker A: You know, a lot of people know me as a Rubyist, but Elixir is the other language that I kind of learned. A lot of Ruby people kind of have gotten led into Elixir because one of the people that used to contribute to Ruby on Rails, José Valim, I think, started it.
[00:59:48] Speaker B: We have that, and we use basically a React-based front end.
We do a lot of Three.js work. So if you're a 3D programmer, a lot of game programmers actually go into robotics as well from that standpoint. And that's actually another whole thing: I think video game programming and robotics have a lot of similarities in a lot of ways, because you're reacting to various environments and you deal with a lot of the same kinds of problems.
Computer graphics and robotics as well have had a really long history of being very similar, because a lot of the coordinate transformations and things have always been an issue there.
[01:00:31] Speaker A: Yeah, I mean, you're dealing with 3d space in both cases.
[01:00:34] Speaker B: A lot of times it's similar. Yeah.
The math is all the same. So if you know the math for one, you can do the math for the other.
I mentioned earlier, like control theory.
If you're a control theory person, then you probably already know that you want to do that. But that's a very common path: to go from either mechanical engineering or electrical engineering or aerospace or some sort of engineering into a controls focus, either a master's, a PhD, or even just a concentration like I did, and then to go from there into robotics.
Mechanical engineering, computer vision, or AI, I think, are the other pathways. Then, on the more do-it-at-home side, there's a very popular, I haven't actually gone through it, and I guess I'm doing a plug for a random course, but there's The Construct, and they actually have a simulator inside the browser, a browser-based thing that ties into their back end, where you can learn how to use ROS in order to do some robotics work without ever leaving your desk.
And then if you're more interested in making your own little robot, SparkFun has some little kits, although I guess these days they don't have that much for this. But there are little robotics things that you can get from SparkFun or Adafruit, or there are a lot of other websites that have things like this where you can kind of learn the basics. The one I'm talking about is $114 for basic robotics, but that is kind of a path forward. If you already have some familiarity with software, then using Arduinos and things is a good way to get into it. And then there are also the self-driving car or other robotics courses that follow the computer science track and are available through various websites. So Coursera, edX, things like that have a lot of these kinds of courses. So those are, I think, the easiest ways to get into it if you're kind of curious about it.
[01:03:24] Speaker A: Okay, well, thank you. This has been super insightful, because now my robotics knowledge has just increased a little bit.
So thank you for coming on and discussing and sharing in terms of what's coming up. Another mystery show for next week.
Other than that, you can find us at the rubberduckdevshow.com website.
You can go there to sign up for our email list. I'm actually going to be putting a bit more in each email, so if you want to get more information, you can go ahead and sign up for our email list there. That also has links to all the shows that we've released, as well as links to different content. So some of the content we discussed today I'll be putting in the episode notes for this show. Apart from that, I hope everyone has a great week and we'll see you next week. And until next time, happy coding.
[01:04:19] Speaker B: All right, happy coding.