• aditya mehrotra.

chip updates: misssionnn controlllll THANK YOU CRUISE AUTOMATION [updates]

Yesss so a few days ago we tried to get webviz.io/app working with the chip platform right? And we had a long back-and-forth with the people from cruise automation. Finally, I got on a call with JP and with some help from Evan as well, we were able to solve the problem.

(Also can I say, HUGEEEE SHOUTOUT TO CRUISE because have you ever gotten this kind of customer service for a GitHub project? I don't think so! I don't get this kind of support from most companies I've interacted with - love the cruise people massive shoutout to them for being so patient and helpful).

YESSSSSSSS LIVEEEE VIEWWWW CAMERAAAA AND DATAAAAAAA. Oh my god this software is the best thing ever. We're going to spend the next little while making a custom interface for chip, at least the beginnings of one, and save that layout as a template.

Here's the link: https://webviz.io/app/?rosbridge-websocket-url=ws://
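For reference, webviz talks to the robot through a rosbridge websocket, which is what that URL parameter points at. A minimal sketch of the launch side, assuming the standard rosbridge_suite package on its default port 9090:

```xml
<launch>
  <!-- sketch: start the rosbridge websocket server that webviz connects to -->
  <include file="$(find rosbridge_server)/launch/rosbridge_websocket.launch">
    <arg name="port" value="9090" />
  </include>
</launch>
```

With that running on the robot, the webviz URL just needs `ws://<robot-ip>:9090` filled in after `rosbridge-websocket-url=`.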

So I also want to post the link to the GitHub issue and the major points here just for, you know, documentation purposes: https://github.com/cruise-automation/webviz/issues/423

But basically it was a security issue - mostly my bad but I got to meet JP and Evan!

So far we have the two fisheyes, the global GPS position, a full diagnostics panel, the control_mode, and the final commands. We also want to add IMU visualization data; that probably won't be today, but maybe we can add the depth image from the RealSense T265 as a panel using http://wiki.ros.org/depthimage_to_laserscan?

We should also probably get rid of the global GPS position, because we do have QGroundControl. Remember, we're going to use webviz alongside QGroundControl, not instead of it.

So what we want is the visual-SLAM output from the Intel RealSense T265. I think we need to take the images from the camera, send them to a visual SLAM algorithm, and then map that output as a laser scan. But what I want to try first is using that depthimage_to_laserscan package to see if we can get a visualization.
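The depthimage_to_laserscan idea above would look roughly like this as a launch file. A sketch only: the topic names and frame ID are assumptions, so they'd need to be remapped to whatever the camera actually publishes.

```xml
<launch>
  <!-- sketch: convert a depth image stream into a LaserScan for a webviz panel -->
  <!-- topic names and frame below are assumptions; remap to the real camera topics -->
  <node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan" name="depth_to_scan">
    <remap from="image" to="/camera/depth/image_rect_raw" />
    <remap from="camera_info" to="/camera/depth/camera_info" />
    <param name="scan_height" value="10" />
    <param name="range_min" value="0.45" />
    <param name="range_max" value="10.0" />
    <param name="output_frame_id" value="camera_depth_frame" />
  </node>
</launch>
```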

Okay, so we aren't getting anything from /compressedDepth, which is really annoying (thanks, Intel). So we're going to look into visualizing the camera depth data and the IMU data later. What else do we want to know?

So it would also be good to know if there's a heartbeat from the RIO. We might need to add that into the /diagnostics pane.

You know what, let's do that. Let's work on that: http://wiki.ros.org/diagnostics/Tutorials/Adding%20Analyzers%20at%20Runtime haha here it is...
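At its simplest, the heartbeat could just be "does the RIO answer on its known address?" Here's a stdlib-only sketch of that check; the host and port are hypothetical placeholders, and the real version would wrap the result in a diagnostic_msgs/DiagnosticStatus and publish it to /diagnostics.

```python
import socket

def device_alive(host, port, timeout=0.5):
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# hypothetical RIO address -- replace with the real one on chip
RIO_HOST, RIO_PORT = "10.0.0.2", 1735
```

Run that on a timer, publish the result, and the diagnostics panel gets a live heartbeat row.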

We might even be able to get the IP of our computer... publish things like the SSH address... https://www.geeksforgeeks.org/python-program-find-ip-address/ we can look at this too: https://pythonhosted.org/ifaddr/ but let's start with what we have.
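The local-IP trick from those links boils down to a few lines of stdlib Python: open a UDP socket toward a public address (no packets are actually sent) and read back which interface the OS picked. A sketch:

```python
import socket

def local_ip():
    """Best-effort local IP address of this machine.

    Connecting a UDP socket doesn't send any traffic; it just makes the
    OS choose an outgoing interface, which we then read back.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # fallback when there's no route at all
        return socket.gethostbyname(socket.gethostname())
    finally:
        s.close()
```

That string could then be published as a diagnostics key/value so the SSH address shows up right in the panel.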

YESSSS look at that! It says we aren't connected to the RIO! I think the name is wrong but we can fix that. Now I want to try to make the diagnostic messages different colors depending on the situation.

And here's that! http://docs.ros.org/api/



So we're going to change that...
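The coloring comes from the `level` field of diagnostic_msgs/DiagnosticStatus (OK=0, WARN=1, ERROR=2, STALE=3), which the diagnostics panel renders green/yellow/red. A minimal sketch of the mapping for the RIO heartbeat, with the status built as a plain dict for illustration:

```python
# level constants from diagnostic_msgs/DiagnosticStatus
OK, WARN, ERROR, STALE = 0, 1, 2, 3

def rio_status(connected):
    """Build a status entry: green (OK) with a heartbeat, red (ERROR) without."""
    return {
        "name": "RIO heartbeat",
        "level": OK if connected else ERROR,
        "message": "connected" if connected else "no heartbeat",
    }
```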

Now let's try this! This is great, now we'll know exactly when all the devices are connected and ready!

YESSS IT WORKSSSS I see the message as red now hahaaaaa. Now let's connect a power supply and see if it changes to something that is green when the RIO is connected. Then we can test VBUS and so on as well.

Look at that!!!! We have CMDS, we have /joy, we have the CONTROL_MODE, we have the full system diagnostics, we have /VBUS, we have the cameras!!!!, hopefully soon we'll get the imu data.

Okay, so this is what I've got so far. I'm going to go ahead and download this layout to my desktop and save the file, just because I don't want to lose the configuration even though it's easy enough to re-create. And good, we have a layout.json in my Downloads folder. I love well-designed software!

So in the future, what I really wanna do is take the joint angles (the thetas), the IMU data, and a model of the robot and create a 3D visualization of chip's state at any given moment, which would be cool. We might need to look into this as a ROS node that publishes something like a set of bodies. So we'll see. But that's the idea.
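One common way to sketch that 3D state view in ROS: publish the thetas as sensor_msgs/JointState and let robot_state_publisher combine them with a URDF into TF frames. Assuming a URDF exists for chip (the `chip_description` package path here is hypothetical):

```xml
<launch>
  <!-- sketch: turn published joint angles + a URDF into TF frames for a 3D view -->
  <!-- chip_description and the urdf path are placeholders, not a real package -->
  <param name="robot_description" textfile="$(find chip_description)/urdf/chip.urdf" />
  <node pkg="robot_state_publisher" type="robot_state_publisher" name="state_pub" />
</launch>
```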

So that is one half of the "Mission Control" for chip, here's the other half:

Together I think that'll make a really nice station! I can't wait to see this running on my big monitor.

Special thanks to cruise for not just the great software, but the great support as well!

Yayyyyy visualization (also, ROS makes FPV real easy now, doesn't it...). Would highly, highly recommend webviz. I'm actually planning to reach back out to JP and suggest one or two features if they have the time. A GPS viewer would be a good one, and so would an IMU viewer; that would be a great feature, especially for aerial platforms.

#omgrobots #cruise #yay #updates #chip #visuals #webviz

© copyright 2019 | aditya mehrotra