chip updates: something cool we can try in the future or maybe now [updates]
Soooo I was listening to my 2.12 lecture today to see what they were going to say about machine learning, and I found this video online.
All this robot has is two discrete tilt sensors and two servos per leg. What if we did the same for chip through ROS? The inputs to the system would be the foot positions and the tilt sensor angles (from the IMU). This approach could let us teach robots that are very difficult to model how to walk more efficiently. We would have to limit the xyz commands conservatively so the robot doesn't over-current and damage itself (which is something we should implement in the current system anyways). So we really need to train a net that takes the following inputs:
x1, x2, x3 ... x12 --> the current xyz foot positions in the GLOBAL frame (IMU frame)
Bx, By --> roll and pitch angles from the IMU
fV, zV, wV --> robot forward_v, side_v, rotational_v
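The input list above works out to a 17-dimensional state vector (12 foot coordinates + 2 tilt angles + 3 body velocities) mapping to 12 foot-position commands. A minimal sketch of those dimensions with a placeholder policy, assuming the names from the note (nothing here is chip's real API, and a trained network would replace the random stub):

```python
import random

# Hypothetical state layout, following the note:
# x1..x12 (foot xyz), Bx, By (tilt), fV, zV, wV (body velocities).
STATE_DIM = 12 + 2 + 3   # = 17 inputs
ACTION_DIM = 12          # commanded xyz per foot, 4 legs * 3 coords

def random_policy(state):
    """Placeholder policy: a learned net would replace this mapping.
    Emits small random foot-position deltas so exploration stays gentle."""
    assert len(state) == STATE_DIM
    return [random.uniform(-0.01, 0.01) for _ in range(ACTION_DIM)]

state = [0.0] * STATE_DIM
action = random_policy(state)
```

The point of writing it this way is that the robot's morphology never appears anywhere in the interface, only the state-in / foot-commands-out mapping.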
The robot doesn't even know it has legs in this case; it just sends commands to them. It moves them randomly and forms hypotheses about what kind of body it might have. It then makes the motion that causes the most disagreement between those hypotheses, which lets it refute what kind of robot it is or isn't.
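The "pick the motion that causes the most disagreement" step can be sketched as scoring candidate actions by the variance of the predictions of the competing body models. Everything below is a toy illustration (the model and action names are made up); the real system would fit forward models from logged IMU data:

```python
def pick_most_informative_action(models, candidate_actions, state):
    """Choose the action whose predicted outcomes disagree most across
    the candidate body models. Executing it gives the observation that
    best refutes some of the hypotheses."""
    def disagreement(action):
        preds = [m(state, action) for m in models]
        mean = sum(preds) / len(preds)
        # Variance of the models' predictions = how much they disagree.
        return sum((p - mean) ** 2 for p in preds)
    return max(candidate_actions, key=disagreement)

# Toy hypotheses about how an action moves the tilt reading:
models = [lambda s, a: 0.0 * a,   # "actions do nothing"
          lambda s, a: 1.0 * a,   # "actions tilt me forward"
          lambda s, a: -1.0 * a]  # "actions tilt me backward"
actions = [0.1, 0.5, 1.0]
best = pick_most_informative_action(models, actions, state=0.0)
# The largest action separates the hypotheses most, so best == 1.0
```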
We put some limits on the outputs and the robot gets to teach itself to walk without damaging itself.
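Those output limits could be as simple as clamping each commanded foot coordinate into a conservative box before it ever reaches the servos. The limit values below are purely illustrative, not chip's real workspace:

```python
# Hypothetical safety box per axis (meters); tune to the real leg workspace.
XYZ_LIMITS = {"x": (-0.05, 0.05), "y": (-0.05, 0.05), "z": (-0.15, -0.05)}

def clamp_foot_command(x, y, z):
    """Keep each commanded coordinate inside a conservative box so the
    learner can't drive a leg into an over-current stall."""
    def clamp(v, lo_hi):
        lo, hi = lo_hi
        return max(lo, min(v, hi))
    return (clamp(x, XYZ_LIMITS["x"]),
            clamp(y, XYZ_LIMITS["y"]),
            clamp(z, XYZ_LIMITS["z"]))

clamp_foot_command(0.2, 0.0, -0.3)  # -> (0.05, 0.0, -0.15)
```

Putting this between the net's outputs and the hardware means even a totally untrained policy stays inside a range the servos can survive.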
Something to try in a future version of chip, when the platform is a little more flexible?