The robotic arm is now fully operational. It has six degrees of freedom and can be controlled remotely from any laptop running the interface software. The robotic hand is capable of simple tasks such as lifting and carrying small objects. I have attached a wireless AV camera to the robotic hand, so a human operator can now “see” what the robot is doing and issue commands accordingly over the wireless data link.
I receive the video using a wireless AV receiver. An EasyCAP II USB device converts the analogue AV signal into digital form and feeds it to the laptop over USB. Next, I use the OpenCV library to capture and interpret the incoming video from the wireless camera. The video is presented to the human operator using my very own Qt GUI. An Arduino UNO (connected to another USB port and accessed through my Qt-Arduino interface class) is used to relay commands generated at the Qt GUI to the ERP1. In the future, telemetry data, battery status, and other diagnostics will be relayed to the human operator via this same Arduino-to-Qt-GUI link.
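The relay running on the Arduino UNO can be sketched roughly as follows. This is a minimal illustration only, not the project's actual sketch: the pin numbers, baud rates, and the use of SoftwareSerial for the wireless module are all assumptions.

```cpp
// Hypothetical Arduino UNO relay sketch (illustrative only).
// Assumes commands arrive from the Qt GUI over the USB serial link
// and are forwarded to a wireless module on a SoftwareSerial port;
// anything the robot sends back is relayed up to the GUI.
#include <SoftwareSerial.h>

SoftwareSerial wireless(10, 11);  // RX, TX pins (assumed wiring)

void setup() {
  Serial.begin(9600);    // USB link to the laptop / Qt GUI
  wireless.begin(9600);  // link out to the ERP1
}

void loop() {
  // Forward GUI commands down to the robot...
  if (Serial.available()) {
    wireless.write(Serial.read());
  }
  // ...and relay telemetry/diagnostics back up to the GUI.
  if (wireless.available()) {
    Serial.write(wireless.read());
  }
}
```

A byte-for-byte pass-through like this keeps the UNO firmware trivial; any command framing or checksumming can then live entirely in the Qt GUI and the Arduino Mega on board the ERP1.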
At present the video-analysis software is limited to displaying the incoming video, either in raw form or after post-processing such as color adjustment, edge detection, or feature tracking. Luckily, the OpenCV library is incredibly powerful, and these complex tasks can be performed with just a few lines of code.
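As an example of how little code such post-processing takes, a capture-and-edge-detect loop in OpenCV might look like this. It is a sketch under assumptions: the EasyCAP device enumerating as capture index 0 and the Canny thresholds are guesses, and it uses OpenCV's own window rather than the Qt GUI.

```cpp
// Illustrative OpenCV capture loop with edge detection (assumptions:
// the EasyCAP appears as device 0; Canny thresholds chosen arbitrarily).
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cap(0);            // EasyCAP typically enumerates like a webcam
  if (!cap.isOpened()) return 1;

  cv::Mat frame, gray, edges;
  while (cap.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);  // convert to grayscale
    cv::Canny(gray, edges, 50, 150);                // edge detection in one call
    cv::imshow("ERP1 camera (edges)", edges);
    if (cv::waitKey(30) == 27) break;               // Esc to quit
  }
  return 0;
}
```

In the real pipeline the processed `cv::Mat` would be handed to the Qt GUI for display instead of `cv::imshow`, but the processing itself really is only a line or two per effect.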
The final goal is to make the ERP1 completely autonomous: no need for human interaction, able to act independently within a limited, unscripted environment. While this may still be a long way off, there is definitely steady progress in the right direction. This video demonstrates the progress so far.
Source code for the Qt GUI, the Arduino UNO wireless relay, and the Arduino Mega (onboard the ERP1) can be found in my GitHub repository.