Controlling an Owi Robotic Arm with Raspberry Pi Using Keyboard, WebSockets, and OpenCV
How It All Started
I recently received a Raspberry Pi (Model B+) as a gift for my 27th birthday. Naturally, the first thing I did was explore the GPIO (General Purpose Input/Output) pins that can be programmed. I quickly discovered the RPi.GPIO Python library, which supports GPIO manipulation, and I began experimenting with some basic circuits—toggling LEDs, working with an RGB LED, generating speaker beeps, and more.
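For instance, blinking an LED comes down to just a few lines with RPi.GPIO. A minimal sketch, assuming an LED (with a current-limiting resistor) wired to BCM pin 18; the pin number is only an example:

```python
import time
import RPi.GPIO as GPIO

LED_PIN = 18  # BCM numbering; assumes an LED and resistor on this pin

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    # Blink once per second until interrupted
    while True:
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()  # release the pins on exit
```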
A few months earlier, a colleague had shown me a webpage featuring robotic arms that caught my attention. The closest retailer selling robotic arms was in the Czech Republic, and they only had one model available—the Owi robotic arm. So, I decided to get it and combine it with my Raspberry Pi.
Connecting the Owi Arm to the Raspberry Pi
The first challenge was connecting the Owi robotic arm to the Raspberry Pi. The model I purchased comes with a wired joystick controller rather than any computer interface, so I had to work out how to drive the arm's motors from the Raspberry Pi directly. After a few hours of searching, I found a video tutorial demonstrating how to toggle relays with the Raspberry Pi. I then acquired two relay boards and two 3V DC adapters and connected all of the motors to the Raspberry Pi through the relays.
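From the software's point of view, each relay channel then behaves like any other GPIO output. A rough sketch of that abstraction, with hypothetical pin numbers and channel names, assuming an active-low relay board (a common design where pulling the pin LOW energizes the relay):

```python
import time
import RPi.GPIO as GPIO

# Hypothetical mapping of relay channels to BCM pins, one pin per
# motor direction; adjust to match the actual wiring.
RELAYS = {
    "base_left": 5, "base_right": 6,
    "shoulder_up": 13, "shoulder_down": 19,
    "elbow_up": 20, "elbow_down": 21,
}

GPIO.setmode(GPIO.BCM)
for pin in RELAYS.values():
    # Active-low board: HIGH means the relay is off
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.HIGH)

def pulse(channel, seconds):
    """Energize one relay channel for a fixed duration."""
    pin = RELAYS[channel]
    GPIO.output(pin, GPIO.LOW)   # relay on, motor runs
    time.sleep(seconds)
    GPIO.output(pin, GPIO.HIGH)  # relay off, motor stops

pulse("base_left", 0.5)
GPIO.cleanup()
```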
Controlling the Robotic Arm with a Keyboard
Once everything was connected, writing a small script to toggle the relays and control the Owi arm’s joints became straightforward. Using the keyboard, I could now easily control the arm's movements.
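That script isn't reproduced here, but a minimal equivalent built on Python's standard curses module (for reading single key presses) might look like this; the key-to-pin mapping is hypothetical:

```python
import curses
import RPi.GPIO as GPIO

# Hypothetical key -> BCM pin mapping; mirror your relay wiring here.
KEYS = {"q": 5, "a": 6, "w": 13, "s": 19, "e": 20, "d": 21}

GPIO.setmode(GPIO.BCM)
for pin in KEYS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.HIGH)  # active-low relays

def main(screen):
    screen.addstr("q/a, w/s, e/d pulse a joint; x quits.\n")
    while True:
        key = screen.getkey()
        if key == "x":
            break
        if key in KEYS:
            pin = KEYS[key]
            GPIO.output(pin, GPIO.LOW)   # motor on
            curses.napms(200)            # run for 200 ms per key press
            GPIO.output(pin, GPIO.HIGH)  # motor off

try:
    curses.wrapper(main)
finally:
    GPIO.cleanup()
```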
Keyframing
The Owi robotic arm is more of a toy than a professional device, so it lacks stepper motors and offers no position feedback. As a result, the only way to implement keyframing was to record each movement as a motor plus a run duration. However, I quickly realized that it's nearly impossible to return the arm to a previous position accurately using recorded timings alone: small errors accumulate with every move. You can save positions and replay them, but you'll need to manually reset the arm to its default position from time to time.
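A sketch of what this time-based keyframing can look like; the motor names and pins are placeholders, and the drift described above is exactly what the replay suffers from:

```python
import time
import RPi.GPIO as GPIO

# Hypothetical relay pins, one per motor direction
MOTORS = {"base_left": 5, "base_right": 6, "shoulder_up": 13}

GPIO.setmode(GPIO.BCM)
for pin in MOTORS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.HIGH)  # active-low relays

recording = []  # a "keyframe" is just a motor name and a duration

def move(motor, seconds, record=True):
    """Run one motor for a fixed time, optionally recording the step."""
    pin = MOTORS[motor]
    GPIO.output(pin, GPIO.LOW)
    time.sleep(seconds)
    GPIO.output(pin, GPIO.HIGH)
    if record:
        recording.append((motor, seconds))

def replay():
    """Replay the recorded steps. Timing errors accumulate, which is
    why the arm has to be re-homed by hand every so often."""
    for motor, seconds in recording:
        move(motor, seconds, record=False)

move("base_left", 0.4)
move("shoulder_up", 0.8)
replay()
GPIO.cleanup()
```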
One potential solution for resetting the arm to its default position is using OpenCV to recognize visual features of the default position. While this might be an interesting project, it could also be time-consuming and impractical. It might be better to invest in a robotic arm with stepper motors if precise control is a priority.
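For anyone tempted to try it anyway, the core of such a check could be as simple as template matching the camera frame against a reference photo of the home position. I never built this, so treat it purely as a sketch; the file name and threshold are made up:

```python
import cv2

# Hypothetical reference photo of the arm in its home position,
# taken from the same fixed camera
reference = cv2.imread("home_position.png", cv2.IMREAD_GRAYSCALE)

def looks_like_home(frame, threshold=0.9):
    """Return True when the frame closely matches the reference image."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray, reference, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)
    return max_val >= threshold
```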
Controlling the Arm via Wi-Fi
Once I had keyframing working, I wanted to explore other control methods, such as using the local Wi-Fi network. The easiest approach I found was WebSockets, which allows control from any device with a JavaScript-capable web browser. Initially, I tried setting up my laptop as the server (using PHP and Node.js), but I ran into issues connecting to the Raspberry Pi.
I then switched to setting up the Raspberry Pi as the server (using Python) and created a simple HTML client with JavaScript. This setup worked, as demonstrated in the video below. The keys "q/a," "w/s," "e/d," "r/f," "t/g," and "y/h" were used to toggle the relays.
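My original server code isn't shown here, but a minimal equivalent using the asyncio-based websockets package (recent versions; older ones also pass a path argument to the handler) would look roughly like this. The toggle_relay helper is a stand-in for the GPIO code sketched earlier:

```python
import asyncio
import websockets  # pip install websockets

def toggle_relay(key):
    # Stand-in for the GPIO relay code from the earlier sketches
    print(f"toggling relay for key {key!r}")

async def handler(websocket):
    # Each message from the browser is a single key character
    async for message in websocket:
        toggle_relay(message)

async def main():
    # Listen on all interfaces so any device on the Wi-Fi can connect
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

asyncio.run(main())
```

On the client side, all the HTML page needs is a WebSocket opened to the Pi's address and a keydown handler that sends the pressed key as a message.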
Face Detection
The most exciting part of this project was integrating computer vision with the Owi arm. I began by installing the OpenCV Python libraries. Once they were set up, implementing face-following movements was relatively straightforward: when the software detects a face, it reports a bounding rectangle, and the arm is driven to bring that rectangle toward the center of the frame.
However, I was disappointed by the Raspberry Pi's performance: face detection only ran in real time at a resolution of 160x120 pixels, and anything larger introduced unacceptable lag. Additionally, the Owi arm's motors, being designed for a toy, are slow, and there is no way to adjust their speed; you can only toggle them on and off.
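For reference, the core of such a face-following loop looks roughly like this. It uses the stock Haar cascade that ships with the opencv-python package and prints the intended direction instead of driving the GPIO pins; the dead-zone width is arbitrary:

```python
import cv2

# Standard frontal-face Haar cascade bundled with opencv-python
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 160)   # keep the resolution tiny;
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 120)  # anything larger lagged on the Pi

while True:  # stop with Ctrl+C
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Compare the face centre to the frame centre; in the real
        # setup this would pulse the matching relay instead of printing
        face_cx = x + w // 2
        frame_cx = frame.shape[1] // 2
        if face_cx < frame_cx - 10:
            print("move left")
        elif face_cx > frame_cx + 10:
            print("move right")
cap.release()
```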
Conclusion
The Owi robotic arm is a fun tool and may be suitable for simple projects. However, it falls short for more complex tasks due to its limitations in precision, speed control, and real-time processing capabilities.