Programming a Snake Arm
rus_dikobraz

Science is like sex: sometimes something useful comes out, but that is not the reason we are doing it. R. Feynman





Recently a new robot was born in our lab - the snake arm for tube inspection (by Igor Orlov).




Frankly speaking, there are plenty of robots in our lab, ranging from small hexapods to giant manipulators and prototypes of walking machines that require kilowatts of power. But you will hardly find anything moving, because nobody cares about programming them. Almost all of these projects are dropped right after construction, the realization of a few reference trajectories, and grabbing results for a PhD thesis or a grant report. So the students joke that we don't make robots, we make their corpses. The only bright spot is the Eurobot robots, but that is a completely different story.



By chance Nataly is not working at the moment, I've got a few weeks of leave, and we've got a vague proposal to build a commercial manipulator system. To prove our abilities we were supposed to make this robot follow a line drawn on a board. I still don't believe anything will come of this, but we rushed into the project.



We started with line identification. We also drew a frame on the board to determine the transformation from the camera coordinate system to world space and to prune false lines. We are not very experienced in computer vision, so the chosen approach may be error-prone or inefficient. Anyway, we had neither the time nor the desire to build a super-effective and robust detector. Trying to recall our university CV courses, we ended up with the following workflow (a rough code sketch follows the list):




  1. Bilateral filter with a small kernel (we hoped it would remove noise and compression artifacts from our camera).

  2. Single-Scale Retinex.

  3. Threshold binarization.

  4. Median filter to remove noise after binarization.

  5. Morphological noise removal (we dropped it in the final version, it was almost useless).

  6. Canny edge detector.

  7. Find the largest contour and the largest contour inside it. These contours are considered to be the frame border and the line, respectively.

  8. Find the transformation from image space to world space using this formula.
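For reference, here is a rough sketch of that pipeline in Python with OpenCV and NumPy. The kernel sizes, thresholds, Retinex sigma and board dimensions are illustrative assumptions, not the values we actually tuned.

```python
# Rough sketch of the line-detection pipeline (Python, OpenCV 4.x + NumPy).
# Kernel sizes, thresholds, the Retinex sigma and board dimensions are assumptions.
import cv2
import numpy as np

def single_scale_retinex(gray, sigma=80.0):
    """Single-scale Retinex: log(I) - log(I * Gaussian), evens out uneven illumination."""
    img = gray.astype(np.float32) + 1.0
    retinex = np.log(img) - np.log(cv2.GaussianBlur(img, (0, 0), sigma))
    return cv2.normalize(retinex, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def find_border_and_line(frame_bgr):
    """Returns (border contour, line contour), either of which may be None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.bilateralFilter(gray, 5, 50, 5)                    # 1. small-kernel bilateral filter
    gray = single_scale_retinex(gray)                             # 2. single-scale Retinex
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)  # 3. binarization
    binary = cv2.medianBlur(binary, 5)                            # 4. median filter
    #                                                               5. (morphology dropped in the end)
    edges = cv2.Canny(binary, 50, 150)                            # 6. Canny edge detector

    # 7. Largest contour = drawn frame border, largest contour nested inside it = the line.
    contours, hierarchy = cv2.findContours(edges, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    hierarchy = hierarchy[0]                                      # rows: [next, prev, child, parent]
    border = max(range(len(contours)), key=lambda i: cv2.contourArea(contours[i]))
    inner = [i for i in range(len(contours)) if hierarchy[i][3] == border]
    line = max(inner, key=lambda i: cv2.contourArea(contours[i])) if inner else None
    return contours[border], None if line is None else contours[line]

# 8. Homography from image space to the board plane, built from the four frame corners.
def board_homography(frame_corners_px, board_w_mm=400.0, board_h_mm=300.0):
    world = np.float32([[0, 0], [board_w_mm, 0], [board_w_mm, board_h_mm], [0, board_h_mm]])
    return cv2.getPerspectiveTransform(np.float32(frame_corners_px), world)
```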



Here are some pictures of how it looks after some of these steps:


And it worked pretty well in all our conditions. We only had problems with glare on the board. The board is very reflective, and sometimes glare split the border and broke our algorithm. That is why you will sometimes see a lamp illuminating the board: it washes out the glare from the overhead light.



It is a good idea to organize a workspace even for such small problems. We constructed an improvised testing installation using a network camera and a printer (although the printer was used in an unusual way).




And this is the installation for night-time work :).




After more than 13 hours of zealous coding and several hours of integration in a very tense atmosphere :)




We finally ended up with the first working demo. The installation was constructed by Igor Orlov and Anton Alyseychik.






We were cheered by the success, although I expected a better result. As you can see, the structure is very elastic: there are many joints, and the steel plates are not hardened and are too thin. The static error at the end link is about 7 cm due to deformation, and vibration produces a horrible effect.



Now it is time to reveal the manipulator design and the controller architecture. The arm uses HS-M7990TH mega-torque servos and a Lynxmotion SSC-32 servo controller. This makes the controller very simple: we just compute the desired joint angles and feed them to the board. The board then generates the PWM signal encoding the target servo position, and each servo's integrated controller runs high-gain, high-frequency PID logic to hold that position.
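Just to show how thin this software layer is, here is a minimal sketch of feeding joint angles to the SSC-32 over a serial port with pyserial. The port name, baud rate, channel mapping and pulse-width calibration are assumptions.

```python
# Minimal sketch of sending joint angles to the SSC-32 (Python + pyserial).
# Port name, baud rate, channel mapping and pulse-width calibration are assumptions.
import math
import serial

PORT = "/dev/ttyUSB0"          # hypothetical serial port
CHANNELS = [0, 1, 2, 3, 4, 5]  # hypothetical servo channel for each joint

def angle_to_pulse_us(angle_rad, center_us=1500.0, us_per_rad=2000.0 / math.pi):
    """Map a joint angle to a servo pulse width, clamped to the usual 500-2500 us range."""
    return int(min(2500.0, max(500.0, center_us + angle_rad * us_per_rad)))

def send_joint_angles(link, angles_rad, move_time_ms=100):
    """One SSC-32 group move: '#<ch>P<pulse> ... T<time>' moves all servos simultaneously."""
    cmd = "".join("#%dP%d " % (ch, angle_to_pulse_us(a))
                  for ch, a in zip(CHANNELS, angles_rad))
    link.write((cmd + "T%d\r" % move_time_ms).encode("ascii"))

if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=1) as link:
        send_joint_angles(link, [0.0] * len(CHANNELS))   # move everything to the neutral pose
```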



We don't precompute the whole trajectory using inverse kinematics; we just compute the change of the servo angles required to track the trajectory or to move to a desired position. More precisely, we wish to compute the joint velocity $\dot{q}$ so that the end-effector velocity $J(q)\dot{q}$ is close to the desired velocity $v_d$. We also introduce a diagonal matrix $W$ of regularization coefficients to handle ill-conditioned problems near singularities and to weight the joints. By specifying lower coefficients for joints closer to the end-effector we encourage them to be more mobile than the ones closer to the root. The controller is relatively simple:
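The formula image from the original post is missing here. A plausible reconstruction, assuming the diagonal matrix $W$ holds the per-joint regularization coefficients mentioned above, is the weighted damped-least-squares step:

```latex
% Reconstructed (assumed) controller: pick \dot{q} minimizing ||J(q)\dot{q} - v_d||^2 + \dot{q}^T W \dot{q}
\dot{q} = \left( J(q)^{\top} J(q) + W \right)^{-1} J(q)^{\top} v_d ,
\qquad q \leftarrow q + \dot{q}\,\Delta t ,
\qquad W = \mathrm{diag}(w_1, \dots, w_n), \; w_i > 0
```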




The Jacobian $J(q)$ is computed numerically (see "Introduction to Inverse Kinematics with Jacobian Transpose, Pseudoinverse and Damped Least Squares Methods" by Samuel R. Buss). The desired velocity $v_d$ is computed assuming that we track the trajectory at constant speed.
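To make the step above concrete, here is a sketch in NumPy; the planar forward kinematics, link lengths, finite-difference step and default weights are stand-ins for illustration, not the real robot model.

```python
# Sketch of one velocity-IK step with a numerically differentiated Jacobian (NumPy).
# The planar forward kinematics, link lengths, step sizes and weights are stand-ins.
import numpy as np

LINK_LENGTHS = np.array([0.12, 0.12, 0.12, 0.12, 0.12])   # assumed link lengths, metres

def forward_kinematics(q):
    """Stand-in planar chain: end-effector (x, y) for joint angles q (rad)."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINK_LENGTHS * np.cos(angles)),
                     np.sum(LINK_LENGTHS * np.sin(angles))])

def numerical_jacobian(q, eps=1e-4):
    """Central finite differences: J[i, j] = d x_i / d q_j."""
    x0 = forward_kinematics(q)
    J = np.zeros((x0.size, q.size))
    for j in range(q.size):
        dq = np.zeros_like(q)
        dq[j] = eps
        J[:, j] = (forward_kinematics(q + dq) - forward_kinematics(q - dq)) / (2.0 * eps)
    return J

def velocity_ik_step(q, target, dt=0.02, speed=0.05, weights=None):
    """Move the end-effector toward `target` at (roughly) constant speed."""
    error = target - forward_kinematics(q)
    v_d = error / max(np.linalg.norm(error), 1e-9) * speed      # constant-speed desired velocity
    J = numerical_jacobian(q)
    W = np.diag(weights) if weights is not None else 1e-3 * np.eye(q.size)
    q_dot = np.linalg.solve(J.T @ J + W, J.T @ v_d)             # weighted damped least squares
    return q + q_dot * dt

if __name__ == "__main__":
    q = np.zeros(LINK_LENGTHS.size)
    for _ in range(200):
        q = velocity_ik_step(q, target=np.array([0.3, 0.3]))
```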



This controller is good enough for this robot, but it does not take link elasticity into account. I was against a software solution to this problem. We somehow stiffened the most problematic root joint, but the problem persisted. We assumed that a link bends proportionally to the moment produced by gravity. So if we add to the joint angle a delta computed from the torque produced by gravity, we should compensate this elasticity. If $c_i$ is the i-th link's center of mass and $m_i$ is the i-th link's mass, then this delta is computed as follows:
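The compensation formula was also an image and is missing from this copy. Under the stated proportionality assumption, a plausible reconstruction (with $p_j$ the position of joint $j$, $a_j$ its axis, $\mathbf{g}$ the gravity vector and $k_j$ an experimentally fitted per-joint compliance coefficient, all of which are my notation) is:

```latex
% Assumed reconstruction of the gravity-compensation delta for joint j
\tau_j = a_j \cdot \sum_{i \ge j} \left( (c_i - p_j) \times m_i \mathbf{g} \right),
\qquad \Delta q_j = k_j \, \tau_j
```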




Finally, we gave a presentation. The job opportunity still looks very vague: some people want a teleoperated platform with a manipulator to work in hazardous areas of their facility where humans cannot go. Here is the final video with our presentation for them.




