The factory camera for the Raspberry Pi comes with a tiny lens that has a field of view of about 67 degrees diagonal (53 degrees horizontal and 41 degrees vertical). As any analemma enthusiast will tell you, this FOV only just meets the minimum needed (I will post details on this soon) to capture a full figure-of-eight image with enough room on the sides for a nice foreground. So the only way to get my Pi to capture a full analemma is to change the factory lens.
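As a rough sanity check on those quoted numbers, the FOV along any sensor dimension is just 2·atan(d / 2f). The sensor dimensions and focal length below are assumptions (approximate figures for the OV5647 sensor and its stock lens), so expect the results to land within a degree or two of the quoted values, not match them exactly:

```python
import math

def fov_deg(sensor_dim_mm, focal_mm):
    """Angular field of view for a given sensor dimension and focal length."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_mm)))

# Assumed values: approximate OV5647 active-area size and stock lens focal length
SENSOR_W, SENSOR_H = 3.67, 2.74   # mm
FOCAL = 3.6                        # mm

diag = math.hypot(SENSOR_W, SENSOR_H)
print(f"H: {fov_deg(SENSOR_W, FOCAL):.1f}  "
      f"V: {fov_deg(SENSOR_H, FOCAL):.1f}  "
      f"D: {fov_deg(diag, FOCAL):.1f}")
```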
Luckily this is quite easily done. I followed the excellent instructions posted here to remove the factory lens and fitted an M12 (12 mm) mounting bracket from an old broken webcam in its place. Unfortunately the mounting screws did not line up with the holes provided on the Pi’s camera PCB, so I just glued it into place. This is what the result looks like…
This is just an archive of links to various Python projects I made while enrolled in the Python programming classes at Coursera.
Feel free to click on the links, play the games and re-use the code as you like (unless you are doing them as part of Coursera’s homework assignments)!
Just press the ‘Play’ button on the CodeSkulptor toolbar to start the program.
Start a CodeSkulptor program
Drum Test. A testing program that simulates a rotating drum originally developed by the Royal Air Force to find suitable candidates for their pilot training programme. It was adapted from a test used to select bus drivers in London!
Memory. A simple number based game using the rules of the card game, Memory
Asteroids Clone. Shoot the asteroids. Sound effects will probably only work inside Google’s Chrome browser 😦
Obstacle Avoidance. An obstacle avoidance program for autonomous robots. First set the goal, add obstacles as needed, and then step the robot through the arena and observe as it reaches the goal while avoiding crashing into obstacles en route.
TwentyFortyEight. Homework assignment to create a clone of the addictive game 2048. You can find the original version here.
Most robots, especially autonomous robots, need to be clever enough to avoid bumping into obstacles. To do this they need sensors to investigate the environment around them, and they need to process this data to identify obstacles in their vicinity. Finally, they need to be able to generate motor commands that steer them clear of any obstacles around them. This is the simplest form of obstacle avoidance, and there are tons of examples on the internet, with plenty of little robots that can do this quite nicely.
With this sort of rudimentary obstacle avoidance algorithm, a robot could keep clear of obstacles, but it would most likely wander around aimlessly while doing so. Sometimes robots need to be a little more intelligent. They need to reach a goal: perhaps they are chasing a target, perhaps they need to reach one of many waypoints along a pre-determined path, perhaps they are headed toward a battery charging point or a position of interest, maybe they are meeting up with a friend, perhaps they need to duck under enemy radar cover while approaching a target! Whatever the case, robots that can navigate intelligently need a slightly more robust obstacle avoidance behaviour built into them.
Clever Robots can avoid obstacles as they head towards a goal
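One classic way to combine "reach a goal" with "avoid obstacles" is an artificial potential field: the goal attracts the robot, obstacles inside an influence radius repel it, and the robot simply steps along the net force. This is a minimal sketch of that idea (not necessarily the algorithm my demo program uses); all the gains and radii here are illustrative values:

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=2.0, step=0.1):
    """One step of artificial potential-field navigation.

    The goal pulls with a force proportional to distance; each obstacle
    inside its influence radius pushes away, sharply so when close."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:
            # Repulsion grows rapidly as the robot nears the obstacle
            mag = k_rep * (1.0 / d - 1.0 / influence) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

# Walk a robot from (0, 0) to (10, 10) past an obstacle near the direct path
pos, goal = (0.0, 0.0), (10.0, 10.0)
for _ in range(500):
    pos = potential_field_step(pos, goal, [(5.0, 4.5)])
    if math.hypot(goal[0] - pos[0], goal[1] - pos[1]) < 0.2:
        break
```

Potential fields are simple and fast, but they can trap the robot in local minima (e.g. a U-shaped obstacle), which is why fancier planners exist.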
It’s now fairly easy for me to build a robot that can do stuff. Drive around, balance on two wheels, pick up things, drive around some more, transmit video, obey orders, drive some more. While learning all this has been great fun, robots that just DO things are not much of a challenge anymore. So I have spent the last few months learning to make my robots more clever, building robots that can observe their environment, make intelligent decisions and re-configure themselves to interact optimally with external stimuli.
For me, vision processing was an easy first choice in trying to build intelligent robotics. OpenCV is an incredibly powerful library that one can download for free. OpenCV makes implementing computer-based vision extremely easy, and once you get more familiar with image processing, you start to see that most operations are just elementary arithmetic on matrices. Operations like background subtraction, edge detection, blob detection, Kalman filtering and the extremely useful Hungarian algorithm are all just simple matrix operations. OpenCV is a little tricky to learn, but once you get the hang of it, it’s supremely powerful when it comes to doing interesting things with visual data. I owe thanks for much of what I know about OpenCV to Kyle Hounslow. His video tutorials are a super easy way to get started with OpenCV.
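To see the "just arithmetic on matrices" point, here is background subtraction done by hand on tiny plain-Python "frames": a per-pixel absolute difference against a reference background, thresholded into a binary foreground mask. On real frames OpenCV does the same thing, far faster, with calls like `cv2.absdiff` and `cv2.threshold`:

```python
def background_subtract(frame, background, threshold=30):
    """Per-pixel |frame - background| thresholded into a 0/1 foreground
    mask - elementary arithmetic on matrices, nothing more."""
    return [[1 if abs(f - b) > threshold else 0
             for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

# Toy 4x4 grayscale frames: a bright "object" enters the lower-right corner
background = [[10] * 4 for _ in range(4)]
frame = [row[:] for row in background]
frame[2][2] = frame[2][3] = frame[3][2] = frame[3][3] = 200

mask = background_subtract(frame, background)
# mask is 1 only where the object appeared
```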
A couple of months ago I used the OpenCV library to build a webcam-based, vision-capable robotic arm. I used Qt and OpenCV to implement the video capture and frame processing. The idea here is to get the computer to track the green ball and then send the correct spatial coordinates to the robot arm, which then follows the ball in space. I used my 6 DOF robot arm and ArduinoTalker C++ class to perform the motion following in the “real” world. The movements are shaky because I was too lazy to implement any smoothing algorithm, and the very obvious parallax error is because the camera is fitted to the laptop screen and not onto the arm itself.
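The tracking step itself reduces to finding the centre of mass of the thresholded pixels. This toy version works on a plain binary mask (the colour-thresholding step that produces the mask is omitted); with OpenCV you would get the same coordinates from `cv2.moments` on the mask:

```python
def centroid(mask):
    """Centre of mass of the foreground pixels in a binary mask -
    the (x, y) you would send to the arm as the ball's position."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    if not pts:
        return None          # ball not in view
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# A 3x3 mask with a 2x2 "ball" in the lower-right corner
ball_mask = [[0, 0, 0],
             [0, 1, 1],
             [0, 1, 1]]
c = centroid(ball_mask)
```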
An XY plotter is a machine that can control a plotting instrument (such as a pen, or a cutting tool like a blade or a laser) over two axes in an accurate, precise manner. Computer Numerical Control (CNC) machines are very accurate XY plotters that can be used for anything from decorating cakes to cutting steel plates into very precise shapes and sizes.
I wanted to make a drawing robot that would be able to draw the contours of a human face, so I decided to experiment with some very basic stepper motors and a cheap toy plotter that I bought on the Internet. Unfortunately, the plotter itself is so poorly manufactured that it is useless as a drawing tool, but the whole project gave me much insight into the steps needed to design and build a proper computer controlled plotting machine.
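The core software problem in any XY plotter is turning a straight-line move into interleaved single-axis stepper pulses. A Bresenham-style interpolation does exactly that, and a sketch of it in Python looks like this (the `('X', +1)`-style step commands are my own invented representation, standing in for whatever pulses your stepper driver expects):

```python
def line_steps(x0, y0, x1, y1):
    """Bresenham-style interpolation: break a straight-line move into
    a sequence of single-axis step commands like ('X', +1)."""
    steps = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        e2 = 2 * err
        if e2 > -dy:             # time for an X-axis step
            err -= dy
            x += sx
            steps.append(('X', sx))
        if e2 < dx:              # time for a Y-axis step
            err += dx
            y += sy
            steps.append(('Y', sy))
    return steps

# A shallow diagonal: 5 X-steps interleaved with 2 Y-steps
moves = line_steps(0, 0, 5, 2)
```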
I’m not a very political person. I don’t care much for international relations, economic policies and other such crap. But here is something that popped up on my screen while I was surfing the web, and I was so appalled that I had to write this post.
It is a poster created by the Australian Customs and Border Protection Service, directed at refugees fleeing persecution, headed towards Australia. Let me reiterate: this is not propaganda from the right-wing lobby, this is the voice of the Government of Australia!
KEEP OUT: AUSSIES ON PATROL
What disgusts me is that this policy is in complete and utter disregard of Article 14 of the Universal Declaration of Human Rights, which states that everyone has the right to seek and to enjoy in other countries asylum from persecution. What’s even more outrageous is that Australia has managed to come up with an immigration policy like this even after it has ratified the UDHR.
Reliable high speed wireless connectivity between two or more Arduino boards is something that everyone wanting to get rid of a tabletop tangle of wires will eventually need to implement. As part of my ongoing ERP project, I have decided to modify the ERP chassis to carry a GPS and an array of ultrasonic rangefinders. By establishing a wireless datalink, I hope to be able to build an outdoor mapping robot that would be able to map its surroundings and transmit a 3D image back to a base station.
A Printer’s Hat is very easy to make. Even little children can learn how to fold it very quickly.
It is made from eco-friendly newspaper, is fully biodegradable, and has many, many uses. It makes a super paper plate for chips and munchies at a picnic or at a campsite. It’s great for holding soil for seedlings; lined with a plastic bag it can hold water; and, painted or plain, it makes an attractive hat!
Here is my YouTube video on how to make a Printer’s Hat!
Now that ERP1 is up and running, it needs to be able to fix its position and report it back over the wireless data link. I plan to develop a simple Kalman filter to estimate ERP1’s position, but to do this, I need two things:
(a) An Action Parameter. This will be in the form of a motion vector: direction from a digital compass and magnitude from a wheel-mounted optical encoder.
(b) A Data Update. This will be the estimated position of ERP1 obtained from an external fixing system such as the popular Global Positioning System (GPS).
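Those two ingredients map directly onto the predict/update cycle of a Kalman filter. Here is a deliberately simplified 2D sketch (a scalar position variance instead of a full covariance matrix, and made-up noise values), just to show how (a) and (b) fit together:

```python
import math

def predict(pos, var, heading_deg, distance, motion_var=0.5):
    """(a) Action update: dead-reckon from compass heading (degrees,
    clockwise from north) and encoder distance; uncertainty grows."""
    x, y = pos
    h = math.radians(heading_deg)
    return (x + distance * math.sin(h), y + distance * math.cos(h)), var + motion_var

def update(pos, var, gps_pos, gps_var=4.0):
    """(b) Data update: blend the prediction with a GPS fix, weighted
    by the Kalman gain; uncertainty shrinks."""
    k = var / (var + gps_var)          # Kalman gain: trust in the GPS fix
    x, y = pos
    gx, gy = gps_pos
    return (x + k * (gx - x), y + k * (gy - y)), (1 - k) * var

# Start at the origin, drive 10 m due east (heading 090), then take a GPS fix
pos, var = (0.0, 0.0), 1.0
pos, var = predict(pos, var, 90.0, 10.0)
pos, var = update(pos, var, (10.4, 0.2))
```

The estimate lands between dead reckoning and the GPS fix, pulled toward whichever source is currently less uncertain; a real implementation would carry a full state vector and covariance matrix.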