In Episode 2 of The Connext Podcast, we continue our discussion of ROS and the challenges in securing robotic systems. We also touch on how aspiring engineers can prepare for a career in security or robotics.
In Episode 2 of The Connext Podcast:
- [0:43] A review of notable security exploits - what happened and how they did it
- [3:15] How vulnerabilities make it through production, specifically in the automotive cases
- [12:00] The process of standardization and IEEE’s mistake when creating a security standard - we should all learn from this!
- [17:30] Vince’s advice to aspiring robotics and systems engineers
- [18:05] Where to start if you want to learn more about security and robotics (skills, languages and websites - oh my!)
Related Content:
- [Whitepaper] The Transition from ROS 1 to ROS 2
- [FREE On-demand webinar] Space Rovers and Surgical Robots: System Architecture Lessons from Mars
- [Product Page] RTI Connext DDS Secure
- [Podcast] ROS & Securing Robotic Systems - Part 1
Podcast Transcription:
Lacey Trebaol: Hi, and welcome to Episode 2 of the Connext Podcast.
I'm Lacey Trebaol and I'm really excited to share this episode with you. It's Part Two in a two-part series where I'm interviewing Vince DiLuoffo, who is an RTI customer, a Connext user, and also a PhD candidate from Worcester Polytechnic Institute in Massachusetts.
The focus of his PhD thesis is on robotics and security. In Part One of this two-part interview, we discussed ROS, ROS 2.0, and some of the challenges associated with security and robotics.
Now, in Part Two, we're going to expand on this a bit and also talk about the work Vince DiLuoffo is doing for his PhD thesis.
We hope you enjoy.
[0:43] A review of notable security exploits - what happened and how they did it
Vince DiLuoffo: There were just a couple of exploits that ... you know, you brought up the Jeep. There was another one, which was the Tesla onboard network, which was exploited-
Lacey Trebaol: That's right.
Vince DiLuoffo: ... in 2015. There was-
Lacey Trebaol: So what happened in that one?
Vince DiLuoffo: My understanding is they got access to the CAN bus, which is the onboard network, and they were able to take control of the system and exploit it through the infotainment system on the user panel.
Berk Sunar: This was the Nissan. Is that correct?
Vince DiLuoffo: No.
Lacey Trebaol: Tesla.
Vince DiLuoffo: Tesla.
Berk Sunar: Tesla 2? The-
Vince DiLuoffo: Yeah.
Berk Sunar: ... first one was on a Nissan system, done by UCSD researchers. And I think they just repeated the same attack over. And then it also-
Vince DiLuoffo: It could have been.
Berk Sunar: ... watching them repeat it on a Ford Explorer and, you know, going around-
Vince DiLuoffo: So there were a number of exploits that occurred on different platforms. There was the other one with the NASA drone. You had brought up the other drone in Iraq, but this one they exploited and dumped out something like 250 gigs of data off the drone, so ...
Lacey Trebaol: Oh!
Vince DiLuoffo: And these are ... Let's put it this way: there was security applied to these systems. They did go through kind of a mature development process, but you still had exploits, and these are considered side channels that people were still able to use to obtain control, exploit some of these networks and systems, get information, and dump sensitive information off the systems.
Lacey Trebaol: Right. So I know that in the automotive industry, almost everything they do on those layers is secured, and it still becomes an issue. A lot of it's based on standards. And then a lot of the things that are used are components shared by all the different manufacturers, which would make sense as to why, if an exploit worked on one, it would very possibly work across a number of vehicles. At least along that family of cars, right? You have one company that owns all of those different brands, and then they're all potentially vulnerable, because they-
Bill Michaelson: There's only a few OEM car component manufacturers-
Lacey Trebaol: Right. So the vulnerability gets ... It's very far-reaching. It's not just one company that's usually impacted.
[3:15] How vulnerabilities make it through production, specifically in the automotive cases
How do you test for vulnerabilities in things like this? I know it's hard for me to say, "Did somebody do their due diligence when they manufactured these things?" You mentioned they consider security, I guess, is the best way to say it. Like, security is not something the automotive manufacturers are ignoring, but at the same time, is this a combination of it not being well understood, so they're not testing for these vulnerabilities that are occurring? How does something like that make it through production? That's a little scary.
Vince DiLuoffo: So I think you're taking the traditional mindset of the auto industry and they had different networks that work on the car.
Lacey Trebaol: Right.
Vince DiLuoffo: So, different closed-off systems, in essence, that were not penetrated from the outside world or communicating with the outside world. As we transition to more advanced technologies, and cars now have capabilities like Bluetooth, wireless, and call-home services, these become more exploitable devices and communication paths that are not being tested and protected.
And so when we get into autonomous vehicles, we're layering on more technology, more communications, that are not being protected by these safeguards. There needs to be something like the security on computer systems, where they have a whole bunch of tool sets for going through penetration testing and things like that. The automotive industry has to experience this and apply those same tool sets to the same notion.
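As a concrete illustration of why the onboard CAN bus discussed here is such an attractive target: a classic CAN frame carries only an arbitration ID, a length, and up to eight data bytes, with no sender identity or authentication field at all, so any node that reaches the bus can emit any message. Here is a minimal sketch in Python using only the standard struct module and the 16-byte `struct can_frame` layout that Linux SocketCAN uses; the specific ID and payload are made up purely for illustration:

```python
import struct

def pack_can_frame(can_id: int, data: bytes) -> bytes:
    """Pack a classic CAN frame in Linux SocketCAN's 16-byte
    `struct can_frame` layout: u32 can_id, u8 dlc, 3 pad bytes,
    8 data bytes. Note what is *absent*: no sender address, no
    MAC, no signature -- nothing ties a frame to the node that
    sent it, which is why bus access implies control.
    """
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    return struct.pack("<IB3x8s", can_id, len(data), data.ljust(8, b"\x00"))

# Hypothetical ID and payload, chosen only for this sketch.
frame = pack_can_frame(0x1A0, b"\x01\x00")
print(len(frame))  # 16 bytes, ready to write to a raw SocketCAN socket
```

On a Linux machine with a (virtual) CAN interface, a frame packed this way could be sent over a raw `AF_CAN` socket; the point of the sketch is simply that the wire format gives a defender nothing to authenticate.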
Lacey Trebaol: It's funny. So you mentioned earlier the idea that, you know, there are certain things that will occur that are not security threats, right, so you'd need to know what to filter out. Like, be able to tune, in a sense, the security of a system. So, I have a newer Honda CRV, and it comes with a level of autonomous driving capability. You can press a button, and then if something is in front of the car and I'm accelerating towards it, it will take over control and stop me. It will apply the brakes. And it also will stay within lanes and has that control. I can take my hands ... It was very weird the first time to take my hands off the wheel of a car. We were on the freeway, and the guy in the car from the dealership was like, "Oh, just do it. It works." And I'm like, "This doesn't feel natural," right?
But, so I discovered a fun thing. So, in New England, it snows here, and there's junk on the road that gets churned up. So, you know, all these great features, they involve sensors. The sensor is on the front of the car where the little H emblem is. That sensor got covered in junk, and my car started flashing the brake warning, like your car's going to start braking, kind of. It beeps, like if you get too close to something and you're accelerating, right? So it was falsely getting information injected into it that was alerting me. And that's kind of, at one level, I guess, some of the stuff you're talking about. It has to learn ... Well, the people doing these things need to learn how to tell the difference and know that's dust on the sensor versus know something's really in front of you. Or know that's a lane you're staying in versus something on the road that isn't actually a lane. What if the one thing it doesn't know is-
Bill Michaelson: ... more nefarious. If I wanna do something to cause your car to stop.
Lacey Trebaol: Right. It's a vulnerability that I ... Yeah.
Vince DiLuoffo: Well, I think, as time goes on ... We say that no system is 100% secure, right? There is always gonna be a vulnerability over time. So through research and people exploiting these high-value targets, shall I say, there are always gonna be some kind of vulnerabilities.
Lacey Trebaol: Well, I'm not sure I would want a robotic system that was also capable of autonomous operation to be 100% secure. Isn't that how stuff like Skynet happens? I've watched too many movies to want that happening.
Berk Sunar: I think, with the bugs we have in the systems now, the process of software engineering and embedded system design is unreliable enough for us, I think, to always be safe.
Lacey Trebaol: So we're gaining benefit from the unreliable ... So these are not bugs, they are features, right? Is that ... this is the way we're gonna look at it?
Berk Sunar: For example. Quality, but not too much.
Lacey Trebaol: Yeah.
Berk Sunar: That's perfect.
Lacey Trebaol: Keep the bar just high enough to trip over it.
Like investigating just the concept of security in systems like this, and how it would need to be dynamic, tunable, and intelligent, honestly. It's not a solution to be applied like, "Download this package, install it, and run it." It's something that is going to have to really be part of the system, with its own kind of intelligent operation occurring in the system. It can't just be blanket-applied. Where do you go to look for this? Like, what do I Google?
Vince DiLuoffo: Well, I think we said that, you know, the robotics community and security communities come together, but I still think this is a new growth area-
Lacey Trebaol: So new I can't-
Vince DiLuoffo: ... that needs to be-
Lacey Trebaol: ... Google it?
Vince DiLuoffo: You can take the traditional security and have kind of a base background associated with that. And then look at the robotics area and have a basic sense of that, so you ... But bringing those together? I don't think there's much literature out there associated with this.
Lacey Trebaol: Are you looking into any other areas for kind of inspiration? You know, while the systems are not identical, they may be similar in some principles.
Vince DiLuoffo: The current literature that's out there is on adversarial learning models, and that goes back to Berkeley. There's also an Italian group that's been researching this, and a couple of other people that are starting to publish and have different blogs on this notion. So there is some kind of early material being published out there already on just the machine learning aspect of this. And that goes back to what everybody's been saying: we still need a lot of effort applied to this whole stack, with robotics, machine learning, and security being combined into an overall solution.
Bill Michaelson: And individual pieces of the problem are certainly well documented, but putting together the system and understanding the system aspects of integrating all of these pieces together-
Lacey Trebaol: The emergent behaviors.
Bill Michaelson: Stay tuned. There really is no one-size-fits-all, one-stop shopping area for these notions yet. It's being created as we go.
Lacey Trebaol: Good thing you're doing a PhD and not just a Master's work on this, right? I don't think you'd be able to finish in time. You've got years. You're gonna be busy though.
That means you picked the right thing.
Vince DiLuoffo: Exactly.
Lacey Trebaol: That's success. Like, the problems you can't Google the answer to, right? You can't go on Quora or Reddit and post something. You're in the right area. There's just no answer.
Vince DiLuoffo: Well, in that ... not only the information associated with it, but you're also dealing with beta-level code in the tool set. That's kind of limited in functionality and documentation. So there are also constraints associated with what you're trying to build and experiment with in those areas.
Lacey Trebaol: So what's the timeline for ROS 2? So right now they're in beta.
Vince DiLuoffo: Right now they're in beta 1. So we see ROS 2, and we see the RTI security side, as having multiple releases as bug fixes and the maturity level come along. Nobody has given definitive dates for when they're gonna have a full-fledged product on either side.
Berk Sunar: It's a tough problem.
Lacey Trebaol: It's very tough.
Bill Michaelson: You're also biased towards what you think the vulnerabilities are, not necessarily what somebody else thinks those vulnerabilities are.
Vince DiLuoffo: Right.
Lacey Trebaol: What somebody else knows those vulnerabilities are because they're actually building the system.
[12:00] The process of standardization and IEEE’s mistake when creating a security standard - we should all learn from this!
Berk Sunar: Again, if you go back to, let's say, WiFi, for instance, right? I mean, when the first laptops with wireless connectivity emerged, the IEEE standards committee got together and drafted a security standard, the old WEP standard. And as soon as it came out, it came out dead. I mean, the first security person who looked at it said, "Okay, this is obviously insecure." And it's like that with any new technology, any new progress, unless security people, people who are actually doing this for a living, are involved in the standardization process. And ROS is, while not an official standard, the de facto standard, right, for robot operating systems.
Lacey Trebaol: Right.
Berk Sunar: So, unless security people are involved from the get-go and actually influence the decisions that are made in terms of what kind of security features are on board ... And in this case, it's even more difficult because it's not an established area under security. There are so many new pieces, right?
Lacey Trebaol: Yeah.
Berk Sunar: The autonomous behavior, the real-time system, the machine learning, cognitive layers-
Lacey Trebaol: And it's not very well understood. I feel like by any kind-
Berk Sunar: Exactly.
Lacey Trebaol: ... of those experts, right, and-
Berk Sunar: And it's gonna-
Lacey Trebaol: ... how these things-
Berk Sunar: ... be an-
Lacey Trebaol: ... behave.
Berk Sunar: ... evolution, but-
Lacey Trebaol: Right.
Berk Sunar: ... from the get-go, we need to help out, you know, with the ... The security people need to come in and help out, and give you positive feedback in terms of how to develop this in the right way. And you know it's not gonna come out right in the first try. There will be hiccups here and there, but as long as we start early, right, rather than waiting for something huge to develop and then later on try to fix it-
Lacey Trebaol: Right.
Berk Sunar: ... then I think we will have a shot.
Lacey Trebaol: But I also think one of the other added benefits of bringing in the security expertise early on ... You know, like when we started doing our secure product, we brought in security researchers. We hired security people, and they're working with the OMG group to write the DDS Security spec; that's what they're doing. But I think one of the neat things that occurs is the mutual education between the people who are distributed systems experts and the people who are security experts. It's not enough just to bring them into one room; they actually need to teach each other and learn, and build something much better than what would be achieved, I feel like, by just the sum of the two. It can't just be that. It has to be better. It's good that they're doing that, 'cause when you consider how ... It's like robots are gonna be doing everything, and I'd really like them to do this in a way that doesn't kill us.
Berk Sunar: We can all agree on that, yeah.
Lacey Trebaol: Fully.
Bill Michaelson: Actually there was a-
Lacey Trebaol: Is there a documented case of that occurring?
Bill Michaelson: Yes.
Lacey Trebaol: Oh, no.
Bill Michaelson: Yes.
Berk Sunar: Was that a pacemaker killing somebody?
Bill Michaelson: Yes. Well-
Vince DiLuoffo: Getting hacked.
Bill Michaelson: They got hacked.
Vince DiLuoffo: Yeah.
Lacey Trebaol: They hacked a pacemaker.
Berk Sunar: Actually, a friend of mine, he's a researcher at the University of Michigan now, he was big on this. He testified in front of Congress. He ran some attacks. It's surprisingly easy to hack into. And sooner or later, at least if you talk about it, you can find a solution and get the companies to move to do something about it, right?
Lacey Trebaol: Right.
Berk Sunar: But they-
Lacey Trebaol: They need to want ... They need to know they have to fix it.
Berk Sunar: Exactly.
Vince DiLuoffo: That's the ... The understated part is notification: having the companies do something about it. Patching their software or their products to close off those exploits.
Lacey Trebaol: And it's kind of an interesting thing. We were talking about a device installed in a person, where you discover there's a way to exploit it, to mess with its operation somehow. And you want a solution that does not involve actually having to go back into that person, 'cause I'm sure you don't wanna have to go perform gosh knows how many thousand surgeries to fix this. So those solutions are not simple. It's literally a box that you're hopefully not ever going into again. Like a new version of the black box mentality: you don't get to go touch those people anymore. There's no anything. You have to fix it on the system side that's owned by that company. And that is scary.
Where was the exploit ... What layer was that carried out at for the pacemaker?
Berk Sunar: Well, I mean, the pacemakers have ... especially the newer ones?
Lacey Trebaol: Yeah.
Berk Sunar: They have wireless capability, right, to-
Lacey Trebaol: Right.
Berk Sunar: ... read off signals, but it turns out you can also send commands and instructions to restart, and so-
Bill Michaelson: Oh, Yeah. And they do that to tune the pacemaker, to tune the operation, so they ... It's a bidirectional data link.
Lacey Trebaol: Right. Which comes in handy when the surgeons are in there and they're working with it. Right. But you hope that capability would no longer exist post-op.
Bill Michaelson: While you're having your-
Vince DiLuoffo: Well, you still-
Bill Michaelson: ... cardiologist will make periodic adjustments, and if they-
Vince DiLuoffo: Yeah., that's right.
Bill Michaelson: You still need software updates.
Lacey Trebaol: How are those adjustments made? Like remotely?
Vince DiLuoffo: Yeah.
Bill Michaelson: Well, not ... I mean, a lot-
Lacey Trebaol: I mean, like how remotely? Like in the-
Bill Michaelson: A lot of the time, I'd say, it's-
Lacey Trebaol: ... room or-
Bill Michaelson: It's in the room across-
Lacey Trebaol: Okay. So you're aware of it.
Berk Sunar: The problem is, proximity doesn't mean anything.
Lacey Trebaol: Right.
Berk Sunar: You say that, but it can be faked.
Vince DiLuoffo: Right.
Lacey Trebaol: That's true.
Berk Sunar: I can go hide behind the wall and still pump up the signal. It's-
Lacey Trebaol: Exactly.
Berk Sunar: ... you know, it's gonna be perceived to be at a-
Vince DiLuoffo: Correct.
Berk Sunar: ... close distance.
Lacey Trebaol: That's how we get good WiFi signal in the corners of our houses.
Vince DiLuoffo: Exactly.
Lacey Trebaol: You can buy those devices-
Vince DiLuoffo: Exactly what I think.
Lacey Trebaol: Yeah. On Amazon.
Bill Michaelson: And it used to take half a million dollars’ worth of equipment in a lab to pull off these exploits. There were a limited number of people that could do it.
Lacey Trebaol: Right.
Bill Michaelson: And now, when you can order all the parts off of Amazon or out of Digi-Key or Mouser-
Lacey Trebaol: eBay.
Bill Michaelson: ... eBay, and be able to put together one of these systems for hundreds of dollars or less, the problem becomes much more widespread.
Lacey Trebaol: So, do we have advice for people who wanna get involved in security or robotics, or security with robotics? Where would you suggest somebody start, who just thought this stuff was awesome? What should they do? What fields of study should they consider?
[17:30] Vince’s advice to aspiring robotics and systems engineers
Vince DiLuoffo: So I'll start off with follow your dreams and never let anybody tell you that you can't achieve it.
Lacey Trebaol: Good advice.
Vince DiLuoffo: And become a fighter and learn that failure is your friend, not your enemy, 'cause you're gonna go through a lot of iterations where a failure is just gonna come about, and that's how you learn. So, just be patient about that.
[18:05] Where to start if you want to learn more about security and robotics (skills, languages and websites - oh my!)
As far as robotics and learning, I would start off with robotics kits. Try to get your hands, feet, and mind around what it is. Put those systems together. Also, things to learn: MATLAB, Mathematica-
Lacey Trebaol: Love Matlab.
Vince DiLuoffo: Wolfram is your friend.
Lacey Trebaol: Another great one, yeah.
Vince DiLuoffo: For students, that's the best website to help you get to some of the answers. Programming languages: C++ and Python are becoming ... Actually, Python is becoming a very big language, especially in the machine learning world.
And then ROS has a number of tutorials that students could learn from, and put-
Lacey Trebaol: And YouTube has a lot.
Vince DiLuoffo: YouTube is a-
Lacey Trebaol: The ROS tutorials are-
Vince DiLuoffo: ... fantastic-
Lacey Trebaol: ... awesome.
Vince DiLuoffo: Yes. Fantastic. There's even a number of courses on there that teachers have just put up, and great informational material to learn things from.
Lacey Trebaol: Yeah.
Vince DiLuoffo: And then again, finding mentors, professors, and fellow students. Study groups work well. Just the notion of helping each other with class material, things like that. Very helpful in that regard.
Lacey Trebaol: Yes. The power of the group.
Vince DiLuoffo: The power of the group.
Lacey Trebaol: So true though. And failure is absolutely ... It's a requirement. You're not doing it right if you're not failing. Failure is gonna happen. You don't learn if you don't fail.
Vince DiLuoffo: I think that a certain set of society right now feels they can't fail. They don't wanna fail, or somebody props them up to do better. They have to go through that learning on their own.
Bill Michaelson: I was gonna say, isn't the ideal model to fail often to succeed sooner?
Lacey Trebaol: Yeah. Really fail often. That's true.
Bill Michaelson: Well, if you're doing your engineering right. If you celebrate failure, you'll have a lot more parties than if you celebrate success.
Lacey Trebaol: Oh, man. It would be like one party every 10 years or something.
And you need that sometimes-
Bill Michaelson: You do.
Lacey Trebaol: Right.
Bill Michaelson: It's ...
Lacey Trebaol: Just figure out why it failed. That's the one that always gets to me. You can't just let it fail and ignore it, guys. You've gotta go back and know why that happened. That's how you get the things that pop up at the last minute, where it's like, "Wait, where is this behavior coming from?" Remember that thing that happened five years ago that you didn't investigate?
Bill Michaelson: Every failure is a learning experience.
Lacey Trebaol: Yeah. Absolutely.
Announcer: Thanks for listening to this episode of the Connext Podcast.
We hope you enjoyed it.
If you have any questions or suggestions for future interviews, please be sure to hit us up over on social media, and you can also reach out to us at podcast@rti.com.
Thanks, and have a great day.