
chip updates: obstacle avoidance plans [updates]

Sooo here's the thing, after we get this puppy walking we'll start looking into simple things like path planning and stepping over obstacles. And for that, we're going to use a single sensor: the Intel RealSense T265 Tracking Camera. So let's describe why we plan to use this camera and a little bit about CHIP's work environment.


(1) So CHIP is currently designed to operate autonomously outdoors, and anywhere via remote control. The thing is, with the T265 tracking camera, developers will also be able to use the platform indoors thanks to the V-SLAM that runs on-board the camera.
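Just to give an idea of what that looks like in practice (this is a rough sketch, not CHIP's actual code): with Intel's pyrealsense2 Python wrapper, pulling the T265's on-board V-SLAM pose is only a few lines.

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)   # the T265 streams 6-DoF pose frames
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        pose = frames.get_pose_frame()
        if not pose:
            continue
        data = pose.get_pose_data()
        # translation is in meters relative to where the camera started,
        # rotation is a quaternion, tracker_confidence runs 0 (failed) to 3 (high)
        print("pos:", data.translation, "rot:", data.rotation,
              "confidence:", data.tracker_confidence)
finally:
    pipeline.stop()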


(2) For our purposes the T265 will be used for obstacle detection and tracking. We know where the camera is with respect to the IMU, we know where the obstacles are with respect to the camera, we know where the feet are, and we know the size of each obstacle. Based on that, the robot can decide which obstacles to go around and which to step over. Using the visual odometry from the camera we can also determine how far the robot has walked, in what direction, and at what angle it is traveling.
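Here's a hypothetical sketch of what that decision could look like. The camera-to-body transform, the thresholds, and the helper names (T_body_cam, MAX_STEP_HEIGHT, etc.) are placeholders I'm using for illustration, not CHIP's real parameters.

import numpy as np

MAX_STEP_HEIGHT = 0.12   # m, tallest obstacle the swing foot could clear (assumed)
MAX_STEP_DEPTH = 0.20    # m, longest obstacle we could clear in one step (assumed)

def obstacle_in_body_frame(p_cam, T_body_cam):
    # transform an obstacle point from the camera frame into the body (IMU) frame
    p = np.append(np.asarray(p_cam), 1.0)   # homogeneous point
    return (T_body_cam @ p)[:3]

def choose_action(obstacle_pos_cam, obstacle_height, obstacle_depth, T_body_cam):
    p_body = obstacle_in_body_frame(obstacle_pos_cam, T_body_cam)
    if obstacle_height <= MAX_STEP_HEIGHT and obstacle_depth <= MAX_STEP_DEPTH:
        return "step_over", p_body    # plan a higher swing trajectory at this point
    return "go_around", p_body        # hand the obstacle to the path planner instead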


We do plan to implement part (2) as soon as we can get CHIP walking. The order is as follows:

(prerequisite a) get a better leg-zeroing method that isn't eyeballing it

(prerequisite b) get the robot balancing with the IMU on two legs (see the rough balance-loop sketch after this list). Once we can balance on two legs, we can learn to walk.

(1) chip walking with controller, no obstacle avoidance or autonomy

(2) chip walking without controller, using ardupilot for autonomy outdoors w/ wireless ground station telemetry

(3) chip walking with or without controller, autonomous or not, with the Intel RealSense for obstacle avoidance and stepping over obstacles.
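For prerequisite (b), here's a very rough idea of what an IMU-based balance loop could look like. The gains and the sign convention are assumptions, nothing here is tuned or pulled from CHIP's code.

# assumed gains, these would have to be tuned on the actual robot
KP, KD = 0.8, 0.05

def balance_correction(roll, pitch, roll_rate, pitch_rate):
    # PD correction on roll and pitch from the IMU; the result would be added
    # to the stance-foot height targets (sign convention is an assumption)
    d_roll = -(KP * roll + KD * roll_rate)
    d_pitch = -(KP * pitch + KD * pitch_rate)
    return d_roll, d_pitch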


This is the plan for CHIP and then we can move to things like MarsPUP.


#updates #chip #walking #autonomy #yay #omgrobots
