chip updates: tf's and music notes [updates]
OKAY SO TODAY I'm going to look into two things on the robot. The first follows from the very brief discussion I had yesterday about tf in ROS: see if we can get a tf out of the IMU (the mavros/ArduPilot one) and visualize it in webviz. This is useful for knowing what orientation the robot thinks it's in. That's the first thing we're going to look into.
IGNORE THIS BRAIN DUMP:
So we're going to look at this link: http://wiki.ros.org/tf/Tutorials/Adding%20a%20frame%20%28Python%29 and use it to publish two coordinate frames: (1) a base_link frame, and (2) an IMU frame!
We're going to write a node that subscribes to /mavros/imu/data and outputs the two tfs. Currently we're using tf2, taking base_link and linking the IMU frame to it. The problem is, I don't think we've connected the map to base_link, so we need to complete that tree. I think the root global coordinate frame is /map.
We really might need to plug in a monitor to see if this transform tree is working but let's try this first.
OKAY SO GOOD NEWS. We have linked /base_link_frd to the imu_frame. But now I think we need to create a visualization message: http://docs.ros.org/melodic/api/visualization_msgs/html/msg/Marker.html
So actually... we've spent this time trying to create a tf between some link and some other link. What we should be doing is: https://github.com/cse481sp17/cse481c/wiki/Lab-12:-Creating-Custom-Visualizations-in-RViz-using-Markers
We need to make the marker an actual shape for this to work, not a TEXT marker like the one above: http://wiki.ros.org/rviz/Tutorials/Markers%3A%20Basic%20Shapes
We can try using a CUBE! And we can ALSO use the tf matrix we generated earlier! We don't need to choose between the two; I just don't think I zoomed in enough in webviz!
And then we'll add two more markers for TEXT that represent the back and left of the robot, so we don't get confused.
Here's a more thought-out log of what we did:
So the goal today, at least initially, was to get the IMU visualization working in webviz, because it would be really cool to see what the robot thinks its orientation is versus what it actually is at any given moment. For orientation we're using the ArduPilot's IMU, not the RealSense camera's, because the ArduPilot is mounted closest to the center of the robot, so it's the IMU that gives us the most accurate sense of the whole platform's orientation.
So what did we do? After talking to JP, he told me that the 3D viewer in webviz can visualize tf matrices, and webviz.io told me that it can visualize markers. This is the first time I've worked with either of these, but it was time to learn, and I used the links in my babbling "let's try this" section above. Here's the summary. The first thing I did was go figure out what was being published as the true base_link of the robot, and I think either the RealSense camera node or mavros is publishing a base_link_frd transform, which is basically the global "ZERO" of the robot.
Some sensor was already publishing base_link_frd, which is essentially the coordinate frame of the world right under the robot. So the first thing I did was use the tf2_ros package to set up a transform between base_link_frd and the IMU data. With that we got the little axis-arrows visualization, if you zoom in.
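That tf2_ros setup can be sketched roughly like this. The frame names (base_link_frd, imu_frame) and the zero translation are my assumptions from memory, not something pulled from the actual node:

```python
#!/usr/bin/env python
# Rough sketch: republish the /mavros/imu/data orientation as a tf.
# Frame names and zero translation are assumptions.
import rospy
import tf2_ros
from sensor_msgs.msg import Imu
from geometry_msgs.msg import TransformStamped


def imu_to_transform(msg):
    """Build the base_link_frd -> imu_frame transform from an Imu message."""
    t = TransformStamped()
    t.header.stamp = msg.header.stamp
    t.header.frame_id = "base_link_frd"  # assumed parent frame
    t.child_frame_id = "imu_frame"       # the new frame we're adding
    # Zero translation: the frame sits on the base_link origin and
    # only the orientation comes from the sensor.
    t.transform.rotation = msg.orientation
    return t


def main():
    rospy.init_node("imu_tf_broadcaster")
    br = tf2_ros.TransformBroadcaster()
    rospy.Subscriber("/mavros/imu/data", Imu,
                     lambda msg: br.sendTransform(imu_to_transform(msg)))
    rospy.spin()


if __name__ == "__main__":
    main()
```

With this running, the imu_frame axes should show up in the webviz 3D panel and rotate as the robot does.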
Then what we did was create a cube marker with the same position and orientation as the imu_frame, centered at the imu_frame origin, just to give us a box that moves around to represent the robot.
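A minimal sketch of that cube marker. The imu_frame name and the /visualization_marker topic are assumptions (the latter is just the conventional marker topic):

```python
#!/usr/bin/env python
# Sketch of the cube marker; frame and topic names are assumptions.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("imu_cube_marker")
pub = rospy.Publisher("/visualization_marker", Marker, queue_size=1)

marker = Marker()
marker.header.frame_id = "imu_frame"  # pose is relative to this frame, so
                                      # the cube rotates with the IMU for free
marker.type = Marker.CUBE
marker.action = Marker.ADD
marker.pose.orientation.w = 1.0       # identity pose: centered on the frame origin
marker.scale.x = marker.scale.y = 0.3
marker.scale.z = 0.1
marker.color.r = 1.0
marker.color.a = 1.0                  # alpha must be non-zero or nothing renders

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    marker.header.stamp = rospy.Time.now()
    pub.publish(marker)
    rate.sleep()
```

Leaving the pose at identity is the trick: the tf from the previous step does all the rotating.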
Then I remembered that the coordinate frame of the ArduPilot is not what we expect. This is why I think base_link_frd is published from some mavros node: it's in the same coordinate frame (unless all IMUs use this kind of frame). So we placed a text marker to signal the back of the robot.
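The "back" label can be sketched the same way, as a TEXT_VIEW_FACING marker offset along the axis that points backwards. Again, the frame name, topic, and offset are assumptions:

```python
#!/usr/bin/env python
# Sketch of the "BACK" label marker; names and the offset are assumptions.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("back_label_marker")
pub = rospy.Publisher("/visualization_marker", Marker, queue_size=1)

label = Marker()
label.header.frame_id = "imu_frame"
label.id = 1                          # distinct id so it doesn't replace the cube
label.type = Marker.TEXT_VIEW_FACING  # always faces the camera
label.action = Marker.ADD
label.text = "BACK"
label.pose.position.x = 0.3           # +x pointed backwards in our frame
label.pose.orientation.w = 1.0
label.scale.z = 0.1                   # text height; scale.x/y are ignored
label.color.g = 1.0
label.color.a = 1.0

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    label.header.stamp = rospy.Time.now()
    pub.publish(label)
    rate.sleep()
```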
Here's the result:
So the x-axis points towards the back, z points down, and y points to the left of the robot. Now we can actually see why we kept having to negate random things in stabilization a while back: we thought x went forward and z went down. Anyways, this is a really good tool because it'll show us what the robot thinks its orientation is at any moment.
Also, this tool showed me something else: our compass calibration is garbage. The reason I know that is the thing keeps turning around its z-axis, meaning it thinks it's yawing wildly when it isn't (unless the GPS fix is good, in which case it doesn't). Anyways, that's something to fix later. We have IMU visualization!
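Those sign flips fall straight out of the frame mismatch. A tiny sketch of the remap, assuming the axes are exactly as observed above (x back, y left, z down) and the target is the standard ROS REP 103 body frame (x forward, y left, z up); this is my reading of the webviz view, not something from the ArduPilot docs:

```python
# Observed ArduPilot body frame: x back, y left, z down.
# REP 103 body frame: x forward, y left, z up.
def ardupilot_to_ros(v):
    """Remap a vector from the observed frame into REP 103: negate x and z."""
    x, y, z = v
    return (-x, y, -z)


# A point one meter in front of the robot is x = -1 in the observed frame:
print(ardupilot_to_ros((-1, 0, 0)))  # (1, 0, 0)
```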
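One way to quantify that yaw wobble later would be to log yaw extracted from the /mavros/imu/data quaternion over time and watch how far it wanders while the robot sits still. A standard ZYX yaw extraction (my own helper, not a mavros API) looks like:

```python
import math


def yaw_from_quaternion(x, y, z, w):
    """Yaw (rotation about z) in radians from a unit quaternion, ZYX convention."""
    return math.atan2(2.0 * (w * z + x * y),
                      1.0 - 2.0 * (y * y + z * z))


# Sanity check: a pure 90-degree yaw, q = (0, 0, sin(pi/4), cos(pi/4)).
s = math.sin(math.pi / 4.0)
yaw = math.degrees(yaw_from_quaternion(0.0, 0.0, s, s))
print(round(yaw, 3))  # 90.0
```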