It’s now fairly easy for me to build a robot that can do stuff. Drive around, balance on two wheels, pick up things, drive around some more, transmit video, obey orders, drive some more. Learning all this has been great fun, but robots that just DO things are not much of a challenge anymore. So I have spent the last few months learning to make my robots more clever, building robots that can observe their environment, make intelligent decisions and re-configure themselves to interact optimally with external stimuli.
For me, vision processing was an easy first choice in trying to build intelligent robotics. OpenCV is an incredibly powerful library that anyone can download for free. It makes implementing computer-based vision extremely easy, and once you get more familiar with image processing, you start to see that most operations are just elementary arithmetic on matrices. Background subtraction, edge detection, blob detection, Kalman filtering and the extremely useful Hungarian algorithm are all just simple matrix operations. OpenCV is a little tricky to learn, but once you get the hang of it, it’s supremely powerful when it comes to doing interesting things with visual data. I owe much of what I know about OpenCV to Kyle Hounslow; his video tutorials are a super easy way to get started.
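The “elementary arithmetic on matrices” point is easiest to see with background subtraction: the foreground mask is just the per-pixel absolute difference between the current frame and the background, thresholded. A minimal sketch on raw pixel buffers (OpenCV’s cv::absdiff and cv::threshold do the same thing on cv::Mat, only vectorised):

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Background subtraction reduced to its core arithmetic:
// mask[i] = 255 if |frame[i] - background[i]| > threshold, else 0.
// Pixels are 8-bit greyscale values stored in flat vectors.
std::vector<uint8_t> subtractBackground(const std::vector<uint8_t>& frame,
                                        const std::vector<uint8_t>& background,
                                        int threshold)
{
    std::vector<uint8_t> mask(frame.size(), 0);
    for (size_t i = 0; i < frame.size(); ++i) {
        int diff = std::abs(int(frame[i]) - int(background[i]));
        mask[i] = (diff > threshold) ? 255 : 0;
    }
    return mask;
}
```

Every fancier operation in the list above follows the same pattern: loop over a matrix, do a little arithmetic per element.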
A couple of months ago I used the OpenCV library to build a webcam-based, vision-capable robotic arm. I used Qt and OpenCV to implement the video capture and frame processing. The idea is to get the computer to track the green ball and then send the correct spatial coordinates to the robot arm, which then follows the ball in space. I used my 6 DOF robot arm and my ArduinoTalker C++ class to perform the motion following in the “real” world. The movements are shaky because I was too lazy to implement any smoothing algorithm, and the very obvious parallax error is because the camera is fitted to the laptop screen and not onto the arm itself.
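For the record, the smoothing I skipped could be as simple as a one-pole low-pass (exponential moving average) on the tracked ball coordinates before they are sent to the arm. A hypothetical sketch, not part of the original code:

```cpp
// Exponential moving average on a tracked coordinate.
// alpha in (0,1]: smaller = smoother output but more lag.
// One Smoother per axis (x and y) would de-jitter the arm.
struct Smoother {
    double alpha;
    double value = 0.0;
    bool initialised = false;

    double update(double measurement) {
        if (!initialised) {
            value = measurement;   // seed with the first sample
            initialised = true;
        } else {
            value = alpha * measurement + (1.0 - alpha) * value;
        }
        return value;
    }
};
```

A couple of lines like this would have traded a little responsiveness for much steadier servo motion.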
An XY plotter is a machine that can control a plotting instrument (such as a pen, or a cutting tool like a blade or a laser) over two axes in an accurate, precise manner. Computer Numerical Control (CNC) machines are very accurate XY plotters that can be used for anything from decorating cakes to cutting steel plates into very precise shapes and sizes.
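The heart of any XY plotter controller is coordinating the two axes so they arrive at the target together. A common approach is Bresenham-style interpolation over integer step counts, which is roughly what hobby CNC firmware does internally. A simplified sketch, returning the step each motor should take on each tick:

```cpp
#include <cstdlib>
#include <utility>
#include <vector>

// Bresenham line interpolation between two positions given in steps.
// Each element of the result is (xStep, yStep), each -1, 0 or +1,
// so both axes finish their moves on the same tick.
std::vector<std::pair<int,int>> lineSteps(int x0, int y0, int x1, int y1) {
    std::vector<std::pair<int,int>> steps;
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    while (x0 != x1 || y0 != y1) {
        int e2 = 2 * err;
        int stepX = 0, stepY = 0;
        if (e2 >= dy) { err += dy; x0 += sx; stepX = sx; }
        if (e2 <= dx) { err += dx; y0 += sy; stepY = sy; }
        steps.push_back({stepX, stepY});
    }
    return steps;
}
```

Feeding each pair to the two stepper drivers in sequence traces a straight line; curves are then just many short lines.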
I wanted to make a drawing robot that would be able to draw the contours of a human face, so I decided to experiment with some very basic stepper motors and a cheap toy plotter that I bought on the Internet. Unfortunately, the plotter itself is so poorly manufactured that it is useless as a drawing tool, but the whole project gave me much insight into the steps needed to design and build a proper computer-controlled plotting machine.
Reliable high-speed wireless connectivity between two or more Arduino boards is something that everyone wanting to get rid of a tabletop tangle of wires will eventually need to implement. As part of my ongoing ERP project, I have decided to modify the ERP chassis to carry a GPS and an array of ultrasonic rangefinders. By establishing a wireless datalink, I hope to build an outdoor mapping robot that can map its surroundings and transmit a 3D image back to a base station.
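Whatever radio module ends up on the chassis, the datalink will need some framing so the base station can reject corrupted packets. A hypothetical minimal format, with a start byte, a length byte and an XOR checksum; the actual module and protocol for this project are still undecided:

```cpp
#include <cstdint>
#include <vector>

// Frame a payload for the radio link as:
// [0xAA start][length][payload bytes...][XOR checksum of payload].
// The receiver resynchronises on 0xAA and drops frames whose
// checksum doesn't match.
std::vector<uint8_t> framePacket(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> pkt;
    pkt.push_back(0xAA);
    pkt.push_back(static_cast<uint8_t>(payload.size()));
    uint8_t csum = 0;
    for (uint8_t b : payload) {
        pkt.push_back(b);
        csum ^= b;
    }
    pkt.push_back(csum);
    return pkt;
}
```

An XOR checksum only catches simple corruption; a CRC would be the next step up if the link proves noisy.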
Now that ERP1 is up and running, it needs to be able to fix its position and report this back over the wireless data link. I plan to develop a simple Kalman filter to estimate ERP1’s position, but to do this, I need two things:
(a) An Action Parameter. This will be in the form of a motion vector. Direction will come from a digital compass and Magnitude from a wheel mounted optical encoder.
(b) A Data Update. This will be the estimated position of ERP1 obtained from an external fixing system such as the popular Global Positioning System (GPS).
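Sketched in code, the filter cycle alternates between the two: a predict step driven by (a) and a correction step driven by (b). This is a scalar, one-axis illustration only; the real ERP1 filter would track both coordinates with a covariance matrix:

```cpp
// Scalar Kalman filter for one position axis.
// predict() applies the action parameter (a): a displacement derived
// from the compass heading and the wheel encoder magnitude.
// update() applies the data update (b): a GPS position fix.
struct Kalman1D {
    double x;  // position estimate
    double p;  // estimate variance (our uncertainty)
    double q;  // process noise: encoder/compass uncertainty per step
    double r;  // measurement noise: GPS fix uncertainty

    void predict(double dx) {
        x += dx;       // dead-reckon forward by the motion vector
        p += q;        // uncertainty grows while dead reckoning
    }

    void update(double z) {
        double k = p / (p + r);   // Kalman gain: trust in the GPS fix
        x += k * (z - x);         // pull estimate toward the fix
        p *= (1.0 - k);           // uncertainty shrinks after the fix
    }
};
```

The nice property is visible even in one dimension: between GPS fixes the variance p climbs, and each fix pulls the estimate toward the measurement by an amount weighted by how uncertain each source is.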
The robotic arm is now fully operational. It has 6 degrees of freedom and can be controlled remotely from any laptop running the interface software. The robotic hand is capable of simple tasks such as lifting and carrying small objects. I have attached a wireless AV camera to the robotic hand, so a human operator can now “see” what the robot is doing and issue commands accordingly over the wireless data link.
Power for the four high-torque DC motors comes from a single 1.3 Ah 12 V battery. The second battery (the taller one) is a 4.5 Ah 6 V battery that will power the microcontroller unit (an Arduino Mega) and the six servos that control the robotic arm. Once basic testing operations are completed, I will add two more servos for a pan-tilt sensor mechanism (wireless camera / sonar ranger / IR sensor etc.) that will also draw power from this 6 V battery.
This project brings together the DIY Haptic Control Glove and the Robotic Hand that I made earlier. The cost of this entire project was less than 25 US$. For details on how they were built and how they work, just follow the link for each.
This video demonstrates the complete project:
1. Calibration of the glove
2. Control of the fingers
3. Touching finger tips of little and index fingers to demonstrate
4. Performing a simple task
5. Detail of servo movements
To test the working of a robot hand like the one I built earlier, I needed a haptic control glove that would encode the flexing of my fingers into electrical signals. These signals would be interpreted by a microcontroller (like the ATMEGA328 on the Arduino platform) and cause the servo motors on the robot hand to mimic my finger movements inside the glove. Electronic puppetry.
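Conceptually, the glove-to-hand link is one line of arithmetic per finger: clamp the flex-sensor reading to its calibrated range, then map it linearly onto the servo’s 0–180° travel. A sketch in plain C++ (mapRange mirrors Arduino’s map(); the sensor limits are whatever the calibration step records, the values below are made up):

```cpp
// Linear rescale, same formula as Arduino's map().
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Convert one finger's flex-sensor ADC reading into a servo angle.
// sensorMin/sensorMax come from calibrating the glove (finger fully
// straight vs. fully bent); readings outside that range are clamped.
int flexToServo(int reading, int sensorMin, int sensorMax) {
    if (reading < sensorMin) reading = sensorMin;
    if (reading > sensorMax) reading = sensorMax;
    return static_cast<int>(mapRange(reading, sensorMin, sensorMax, 0, 180));
}
```

On the Arduino side the result of flexToServo would go straight into Servo::write(), one servo per finger, which is all the “puppetry” really is.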
The word robot actually comes from the Czech word ‘robota’, meaning forced labour; it entered English through Karel Čapek’s 1920 play R.U.R. In Russian, the near-identical word ‘rabota’ simply means work, employment or operation. Funny, I’ve spent nearly two years in Russia and have probably spoken this word many, many times, never really realising that it shares a root with the word robot!
Anyway, this post is a photo-essay/tutorial on how I built my new robotic hand.