Most robots, especially autonomous ones, need to be clever enough to avoid bumping into obstacles. To do this they need sensors to investigate the environment around them, and they need to process this sensor data to identify obstacles in their vicinity. Finally, they need to be able to generate motor commands that steer them clear of those obstacles. This is the simplest form of obstacle avoidance, and there are tons of examples on the internet of little robots that can do this quite nicely.
With this sort of rudimentary obstacle avoidance algorithm, a robot could keep clear of obstacles, but it would most likely wander around aimlessly while doing so. Sometimes robots need to be a little more intelligent. They need to reach a goal…perhaps they are chasing a target, perhaps they need to reach one of many waypoints along a pre-determined path, perhaps they are headed toward a battery charging point or a position of interest, maybe they are meeting up with a friend, or perhaps they need to duck under enemy radar cover while approaching a target! Whatever the case, these kinds of robots…robots that can navigate intelligently…need a slightly more robust obstacle avoidance behaviour built into them.
Clever Robots can avoid obstacles as they head towards a goal
It’s now fairly easy for me to build a robot that can do stuff. Drive around, balance on two wheels, pick up things, drive around some more, transmit video, obey orders, drive some more. While learning all this has been great fun, robots that just DO things are not much of a challenge anymore. So I have spent the last few months learning to make my robots more clever, building robots that can observe their environment, make intelligent decisions and re-configure themselves to interact optimally with external stimuli.
For me, vision processing was an easy first choice in trying to build intelligent robotics. OpenCV is an incredibly powerful library that anyone can download for free. It makes implementing computer-based vision extremely easy, and once you get more familiar with image processing, you start to see that most operations are just elementary arithmetic operations on matrices. Operations like background subtraction, edge detection, blob detection, Kalman filtering and the extremely useful Hungarian Algorithm are all just simple matrix operations. OpenCV is a little tricky to learn, but once you get the hang of it, it’s supremely powerful when it comes to doing interesting things with visual data. I owe thanks for much of what I know about OpenCV to Kyle Hounslow; his video tutorials are a super easy way to get started with OpenCV.
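To make the “just matrix arithmetic” point concrete, here is a minimal sketch of background subtraction. It uses plain Python lists instead of OpenCV, and a toy 3×3 greyscale “frame”, purely as an illustration: each output pixel is just an absolute difference followed by a threshold.

```python
# Illustrative sketch (plain Python, no OpenCV): background subtraction
# reduces to per-pixel arithmetic followed by a threshold.

def subtract_background(frame, background, threshold=30):
    """Return a binary mask: 1 where the frame differs from the
    background by more than `threshold`, else 0."""
    mask = []
    for frow, brow in zip(frame, background):
        mask.append([1 if abs(f - b) > threshold else 0
                     for f, b in zip(frow, brow)])
    return mask

# A tiny 3x3 greyscale example: a bright "object" appears at the centre.
background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 10, 10],
              [10, 200, 10],
              [10, 10, 10]]

print(subtract_background(frame, background))
# -> [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
```

OpenCV does exactly this kind of arithmetic, only vectorised over full-resolution images (and with more sophisticated background models).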
A couple of months ago I used the OpenCV library to build a webcam-based, vision-capable robotic arm. I used Qt and OpenCV to implement the video capture and frame processing. The idea here is to get the computer to track the green ball and then send the correct spatial coordinates to the robot arm, which then follows the ball in space. I used my 6 DOF robot arm and ArduinoTalker C++ class to perform the motion following in the “real” world. The movements are shaky because I was too lazy to implement any smoothing algorithm, and the very obvious parallax error is because the camera is fitted to the laptop screen and not onto the arm itself.
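For what it’s worth, the shakiness could probably be tamed with something as simple as an exponential moving average on the tracked coordinates before they are sent to the arm. Here is a hypothetical sketch (the function names and the alpha value are my own, not part of the project code):

```python
# Hypothetical sketch: smoothing noisy tracked (x, y) coordinates with an
# exponential moving average before sending them to the arm.

def make_smoother(alpha=0.3):
    """Return a function that blends each new point into a running
    estimate; smaller alpha = smoother but laggier motion."""
    state = {"xy": None}

    def smooth(x, y):
        if state["xy"] is None:
            state["xy"] = (x, y)          # first sample passes through
        else:
            px, py = state["xy"]
            state["xy"] = (px + alpha * (x - px), py + alpha * (y - py))
        return state["xy"]

    return smooth

smooth = make_smoother(alpha=0.5)
print(smooth(100, 100))  # -> (100, 100)
print(smooth(110, 100))  # -> (105.0, 100.0)
```

The trade-off is lag: the heavier the smoothing, the further the arm trails behind a fast-moving ball.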
An XY plotter is a machine that can control a plotting instrument (such as a pen, or a cutting tool like a blade or a laser) over two axes in an accurate, precise manner. Computer Numerical Control (CNC) machines are very accurate XY plotters that can be used for anything from decorating cakes to cutting steel plates into very precise shapes and sizes.
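The core trick in any two-axis plotter is coordinating the motors so that a straight segment comes out straight. One classic way to do this, shown here as an illustrative sketch (this is Bresenham’s line algorithm, not the toy plotter’s actual firmware), is to interleave single steps on each axis so that neither motor ever moves more than one step at a time:

```python
# Illustrative sketch: drawing a straight segment on a two-axis plotter by
# interleaving unit steps on each axis (Bresenham's line algorithm).

def line_steps(x0, y0, x1, y1):
    """Yield the (x, y) grid positions visited from (x0, y0) to (x1, y1)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy                 # running error between the two axes
    x, y = x0, y0
    while True:
        yield x, y                # in a real machine: pulse the motors here
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx               # step the X motor
        if e2 < dx:
            err += dx
            y += sy               # step the Y motor

print(list(line_steps(0, 0, 4, 2)))
# -> [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]
```

Each yielded position would correspond to one step pulse sent to the stepper drivers, which is what keeps the pen on the straightest grid path between the two endpoints.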
I wanted to make a drawing robot that would be able to draw the contours of a human face, so I decided to experiment with some very basic stepper motors and a cheap toy plotter that I bought on the Internet. Unfortunately, the plotter itself is so poorly manufactured that it is useless as a drawing tool, but the whole project gave me much insight into the steps needed to design and build a proper computer-controlled plotting machine.
I’m not a very political person. I don’t care much for international relations, economic policies and other such crap. But here is something that popped up on my screen while I was surfing the web, and I was so appalled that I had to write this post.
It is a poster created by the Australian Customs and Border Protection Service, directed at refugees fleeing persecution, headed towards Australia. Let me reiterate….this is not propaganda from the right wing lobby, this is the voice of the Government of Australia!!!
KEEP OUT: AUSSIES ON PATROL
What disgusts me is that this policy is in complete and utter disregard of Article 14 of the Universal Declaration of Human Rights, which states that everyone has the right to seek and enjoy asylum from persecution. What’s even more outrageous is that Australia has managed to come up with an immigration policy like this even after it has ratified the UDHR.
Reliable high-speed wireless connectivity between two or more Arduino boards is something that everyone wanting to get rid of a tabletop tangle of wires will eventually need to implement. As part of my ongoing ERP project, I have decided to modify the ERP chassis to carry a GPS and an array of ultrasonic rangefinders. By establishing a wireless data link, I hope to be able to build an outdoor mapping robot that would be able to map its surroundings and transmit a 3D image back to a base station.
A Printer’s Hat is very easy to make. Even little children can learn how to fold it very quickly.
It is made from eco-friendly newspaper, is fully biodegradable and has many, many uses. It makes a super paper plate for chips and munchies at a picnic or at a campsite. It’s great for holding soil for seedlings; lined with a plastic bag it can hold water; and painted or plain, it makes an attractive hat!
Here is my YouTube video on how to make a Printer’s Hat!
Now that ERP1 is up and running, it needs to be able to fix its position and report this back over the wireless data link. I plan to develop a simple Kalman filter to estimate ERP1’s position, but to do this, I need two things:
(a) An Action Parameter. This will be in the form of a motion vector. Direction will come from a digital compass and Magnitude from a wheel mounted optical encoder.
(b) Data Update. This will be the estimated position of ERP1 obtained from an external fixing system such as the popular Global Positioning System (GPS).
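The two ingredients above map directly onto the two steps of a Kalman filter: the action parameter drives the predict step, and the GPS fix drives the update step. Here is a hedged sketch reduced to one dimension (all the noise values are illustrative guesses, not ERP1’s real numbers):

```python
# Sketch of a 1D Kalman filter: dead-reckoned motion (compass + encoder)
# in the predict step, a GPS fix in the update step.

def kalman_predict(x, p, u, q):
    """Action step: apply motion u; process noise q grows uncertainty p."""
    return x + u, p + q

def kalman_update(x, p, z, r):
    """Data step: blend in measurement z (noise r) via the Kalman gain."""
    k = p / (p + r)            # gain: how much to trust the measurement
    return x + k * (z - x), (1 - k) * p

# The robot believes it is at 0 m (variance 1), drives 5 m by dead
# reckoning, then receives a GPS fix of 5.6 m (GPS variance 4).
x, p = kalman_predict(0.0, 1.0, u=5.0, q=0.5)   # -> x = 5.0, p = 1.5
x, p = kalman_update(x, p, z=5.6, r=4.0)
print(round(x, 3), round(p, 3))                  # prints: 5.164 1.091
```

Note that the update step both nudges the estimate toward the GPS fix and shrinks the variance; the noisier the GPS (larger r), the less the fix is trusted.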
The robotic arm is now fully operational. It has 6 Degrees of Freedom and can be controlled remotely from any laptop running the interface software. The robotic hand is capable of simple tasks such as lifting and carrying small objects. I have attached a wireless AV camera to the robotic hand. A human operator can now “see” what the robot is doing and issue commands accordingly over the wireless data link.
Power for the 4 high torque DC motors comes from a single 1.3Ah 12 V battery. The second battery (the taller one) is a 4.5 Ah 6V battery that will power the micro-controller unit (an Arduino Mega) and the six servos that control the robotic arm. Once basic testing operations are completed, I will add two more servos for a pan-tilt sensor mechanism (wireless camera/ sonar ranger/ IR sensor etc) that will also draw power from this 6V battery.