Thursday, May 15, 2014

Control your Raspberry Pi robot from a web-connected device

Last issue we built a Raspberry Pi robot. This issue, we’ll create a Python web application that lets you control your bot using a first-person view.

After building our Raspberry Pi robot last issue, we’ve all been thinking up ideas for tutorials to extend Jonny Pi’s skills and abilities. One of the first ideas we had was to control our robot from a web-connected device using a simple web application. To make things even more interesting, we’ve decided to incorporate the Raspberry Pi Camera Module so we can show a first-person perspective of what Jonny Pi can ‘see’. In theory, as long as you’re within Wi-Fi range of your robot, you can drive it around simply by watching the feedback from the app itself.
To do this, we’ll teach you how to use the mjpg-streamer software to stream video straight from the Raspberry Pi camera to your web browser. Once we have that in place, we’ll be converting the movement code from the previous issue’s guide so it can be executed from our little web application, and then we’ll write said web app in Python. It’s as easy as Pi!

Resources

A Raspberry Pi robot (see LUD 132 for instructions)
A Raspberry Pi Camera Module
Wi-Fi connectivity
Robot Web App to follow along with
Control the robot

Step by step

Step 01 Connect the camera
The camera should be connected to the Raspberry Pi with the blue side facing the back of the Ethernet port, and the side with the pins facing the SD card. Lift the connector, then slide in the ribbon cable as far as it will go, and push down the connector to secure the cable.
Step 02 Enable the camera
Log into the Raspbian system with the username pi and the password raspberry. You may have an old version that doesn’t support the Raspberry Pi camera. Run sudo apt-get update followed by sudo apt-get upgrade to make sure you are up to date. Once you’ve done that, run sudo raspi-config and select the option to enable the camera. You need to reboot for the changes to apply.
Step 03 Install mjpg-streamer dependencies
We’re going to be using an experimental version of mjpg-streamer that has an input module for the Raspberry Pi camera. It isn’t packaged for the Raspberry Pi, so we’ll need to compile it ourselves. Update the package index with the command sudo apt-get update. We need to install Git, which we’ll use to download the source code, along with CMake, libjpeg and ImageMagick, all of which are needed to build mjpg-streamer. You can install these with:
$ sudo apt-get install git cmake libjpeg8-dev imagemagick
Step 04 Compile mjpg-streamer
Download and compile mjpg-streamer as shown below:
$ git clone https://github.com/liamfraser/mjpg-streamer
$ cd mjpg-streamer/mjpg-streamer-experimental
$ make clean all
Step 05 Start testing
Before we can start mjpg-streamer, we need to tell the loader where to find its input and output modules (the .so files). We export the directory as STREAMER_PATH to give it a convenient name, and also add it to LD_LIBRARY_PATH so the modules can be found. We set these variables like so:
$ export STREAMER_PATH=/home/pi/mjpg-streamer/mjpg-streamer-experimental
$ export LD_LIBRARY_PATH=$STREAMER_PATH
And then start mjpg-streamer in the following way:
$STREAMER_PATH/mjpg_streamer -i "input_raspicam.so -d 200" -o "output_http.so -w $STREAMER_PATH/www"
…where -d $number is the number of milliseconds between captures. You can view the stream by going to http://[pi ip address]:8080/stream.html.
Step 06 Start mjpg-streamer at boot
Make a backup copy of /etc/rc.local with sudo cp /etc/rc.local /etc/rc.local.bak. Then edit /etc/rc.local (with sudo) to contain the three lines from the previous step (the two exports and the mjpg_streamer command), adding a space followed by an ampersand to the end of the last line and making sure that exit 0 is still at the end. Reboot the Pi to make sure it works. The end of your rc.local file should look as below:
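Assuming the paths from the previous step, the tail of the file would look something like this (adjust the paths if your mjpg-streamer checkout lives elsewhere):

export STREAMER_PATH=/home/pi/mjpg-streamer/mjpg-streamer-experimental
export LD_LIBRARY_PATH=$STREAMER_PATH
$STREAMER_PATH/mjpg_streamer -i "input_raspicam.so -d 200" -o "output_http.so -w $STREAMER_PATH/www" &

exit 0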
Step 07 Install Apache and required modules
We’re going to be using Apache as our web server, WSGI as our method of executing Python from the web server, and Pyro (Python Remote Objects) to call functions that move the camera and the robot without needing to run the web server as root to access the GPIO. (mjpg-streamer is actually running as root when executed from rc.local, but that could easily be fixed by making it run as the www-data user instead. Alternatively, you could make it listen on localhost only and reverse proxy web requests for video from Apache to mjpg-streamer.) Install the required packages using:
$ sudo apt-get install apache2 libapache2-mod-wsgi pyro
We also need to start a Pyro nameserver so we can connect to our movement module remotely later on. Edit /etc/default/pyro-nsd and change ENABLED to equal 1. Then start pyro-nsd:
$ sudo /etc/init.d/pyro-nsd start
Step 08 Configure WSGI
To use WSGI, we’ll want to create a user to run the code as; our expert called his robotweb. Use the command sudo adduser robotweb with a password of your choice. Then edit /etc/apache2/sites-enabled/000-default using sudo and your favourite editor. Add the following lines inside the VirtualHost tags, just before the ErrorLog section:
WSGIDaemonProcess robotweb user=robotweb group=robotweb processes=1 threads=2
WSGIProcessGroup robotweb
WSGIScriptAliasMatch /action /home/robotweb/app.py
Step 09 WSGI Hello World
Switch user to robotweb using su robotweb. Open /home/robotweb/app.py in your favourite editor and add the WSGI application code. WSGI always starts by calling the application function of the file you give it, passing in a dictionary representing the environment and a callback function called start_response that you use to send the HTTP status code and any response headers. Once it has those, the main content is returned as a list – in this case containing one string.
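The original listing isn’t reproduced here, but a minimal WSGI Hello World along these lines behaves as described:

def application(environ, start_response):
    # environ is a dictionary describing the request environment;
    # start_response is the callback used to send the status and headers
    output = 'Hello World!'
    response_headers = [('Content-Type', 'text/plain'),
                        ('Content-Length', str(len(output)))]
    start_response('200 OK', response_headers)
    # The response body is returned as a list containing a single string
    return [output]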
Then use exit to get back to the pi user and run sudo apachectl graceful. Any error about Apache not being able to determine the server’s name can be safely ignored. Visiting http://[your pi]/action should show you ‘Hello World!’.
Step 10 Create the movement code
The next piece of code is based on the movement and ultrasonic sensor code from last issue’s robot feature. However, there’s quite a lot of code we won’t need from that (the ultrasonic part), so we’ve rewritten it, adding in the ability to call the code remotely. We’ve called the file movement_server.py (find it on the disc). You’ll also need to make the file executable with chmod +x. You may want to change the speed constant if your robot is too slow or too fast during movement.
The movement class needs to inherit the Pyro Object Base so that it can be called successfully from a Pyro client. For this reason, we need to initialise the base class as part of the movement class’s initialisation function. Apart from that, it’s pretty much a normal class.
For each direction, we set the GPIO voltage to either HIGH or LOW on the appropriate pins, then start the motors, sleep for the amount of time to move, and then call the stop function, which simply changes the motor speed to 0.
Finally, we set up a Pyro server by registering with the Pyro nameserver as robotmovement and starting a request loop, which simply waits for requests from a Pyro client and executes the appropriate function.
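The full movement_server.py is on the disc; as a rough sketch of its structure (the pin numbers, method names and SPEED value below are placeholders, not the wiring from last issue’s feature), it might look something like this:

#!/usr/bin/env python
# movement_server.py - a sketch; substitute the GPIO pins and speed
# constant from last issue's robot feature
import time
import RPi.GPIO as GPIO
import Pyro.core
import Pyro.naming

SPEED = 50  # PWM duty cycle; change if the robot is too slow or too fast

# Hypothetical pin assignments - use your own wiring here
LEFT_DIR, RIGHT_DIR = 17, 22   # direction pins
LEFT_PWM, RIGHT_PWM = 18, 23   # motor speed (PWM) pins

class Movement(Pyro.core.ObjBase):
    def __init__(self):
        # Initialise the Pyro base class so clients can call us remotely
        Pyro.core.ObjBase.__init__(self)
        GPIO.setmode(GPIO.BCM)
        for pin in (LEFT_DIR, RIGHT_DIR, LEFT_PWM, RIGHT_PWM):
            GPIO.setup(pin, GPIO.OUT)
        self.left = GPIO.PWM(LEFT_PWM, 50)
        self.right = GPIO.PWM(RIGHT_PWM, 50)
        self.left.start(0)
        self.right.start(0)

    def _move(self, left_dir, right_dir, seconds):
        # Set the direction pins, run the motors, sleep, then stop
        GPIO.output(LEFT_DIR, left_dir)
        GPIO.output(RIGHT_DIR, right_dir)
        self.left.ChangeDutyCycle(SPEED)
        self.right.ChangeDutyCycle(SPEED)
        time.sleep(seconds)
        self.stop()

    def forward(self, seconds):
        self._move(GPIO.HIGH, GPIO.HIGH, seconds)

    def backward(self, seconds):
        self._move(GPIO.LOW, GPIO.LOW, seconds)

    def left_turn(self, seconds):
        self._move(GPIO.LOW, GPIO.HIGH, seconds)

    def right_turn(self, seconds):
        self._move(GPIO.HIGH, GPIO.LOW, seconds)

    def stop(self):
        # Stopping is simply a motor speed of 0
        self.left.ChangeDutyCycle(0)
        self.right.ChangeDutyCycle(0)

if __name__ == "__main__":
    # Register with the Pyro nameserver as robotmovement and wait for requests
    Pyro.core.initServer()
    ns = Pyro.naming.NameServerLocator().getNS()
    daemon = Pyro.core.Daemon()
    daemon.useNameServer(ns)
    daemon.connect(Movement(), "robotmovement")
    daemon.requestLoop()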
Step 11 Start the movement server on boot
As we did before with mjpg-streamer, we want to start our movement server on boot by adding it to /etc/rc.local before exit 0. Our expert used the following line:
/home/robotweb/movement_server.py &
Step 12 Our web frontend
We’re going to be using Twitter’s Bootstrap 3 framework for our web frontend, as it looks good and is simple to use. Change directory to /var/www. This directory is owned by root, so you’ll want to become root using sudo su. Start by removing the default Apache start page with rm index.html. Then download and extract the Bootstrap files:
$ wget "https://github.com/twbs/bootstrap/releases/download/v3.0.0/bootstrap-3.0.0-dist.zip"
$ unzip bootstrap-3.0.0-dist.zip
$ mv dist/* .
$ rm -r bootstrap-3.0.0-dist.zip dist/
Bootstrap requires jQuery, so we’ll also want to download a copy of that, as well as a jQuery plug-in that lets us submit a form without reloading the page. It would be annoying for our camera stream to disappear each time we wanted to command the robot.
$ cd js/
$ wget "http://malsup.github.com/jquery.form.js"
$ wget "http://ajax.googleapis.com/ajax/libs/jquery/1.7/jquery.js"
Step 13 Start the webpage
The next piece of code we need is our index.html. It should be put in /var/www/index.html, replacing the default Apache page we removed in the previous step. You can find it on the disc. We set a title, import the Bootstrap style sheet and then define our own style to limit the width of the page. After that, we import the required JavaScript and create a simple bit of code that handles the move form (which we’ll be creating shortly) with AJAX rather than typical HTTP requests.
Step 14 Main body of index.html
The body has a header, a container that displays our video stream, and a form that allows us to send a direction – and a period of time to head in that direction for – to our Python web application. When a button is clicked, the value of the seconds text box is sent to the action script, as well as the direction, which comes from the value field of the button that was clicked.
Note that the IP address of the Pi is hard-coded because we need to access the stream on port 8080. This could be improved by giving the Pi a DNS address or reverse-proxying the video stream through port 80 using Apache.
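The full index.html is on the disc; a stripped-down sketch covering both the head (Step 13) and the body (Step 14) could look like the following, where 192.168.0.10 is a placeholder for your Pi’s IP address and the field names are only examples:

<!DOCTYPE html>
<html>
<head>
  <title>Robot Web App</title>
  <link href="css/bootstrap.min.css" rel="stylesheet">
  <!-- Our own style to limit the width of the page -->
  <style>body { max-width: 660px; margin: auto; }</style>
  <script src="js/jquery.js"></script>
  <script src="js/jquery.form.js"></script>
  <script>
    // Submit the move form with AJAX so the video stream doesn't reload
    $(document).ready(function() {
      $('#moveform').ajaxForm();
    });
  </script>
</head>
<body>
  <h1>Robot Web App</h1>
  <!-- The stream is on port 8080, so the Pi's address is hard-coded -->
  <img src="http://192.168.0.10:8080/?action=stream">
  <form id="moveform" action="/action" method="GET">
    <input type="text" name="seconds" value="1">
    <button class="btn btn-default" type="submit" name="direction" value="forward">Forward</button>
    <button class="btn btn-default" type="submit" name="direction" value="backward">Backward</button>
    <button class="btn btn-default" type="submit" name="direction" value="left">Left</button>
    <button class="btn btn-default" type="submit" name="direction" value="right">Right</button>
    <button class="btn btn-default" type="submit" name="direction" value="stop">Stop</button>
  </form>
</body>
</html>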
Step 15 Calling the movement code from our web app
Our web application is really simple. It just takes a direction to move in and an amount of time to do it for. The code, which should replace the Hello World code we put in /home/robotweb/app.py, is on the disc. We import the Pyro core so we can use it to remotely call code on the movement server, and also import the parse_qs function from the cgi module. This allows us to easily parse query strings (which are sent by the webpage we created in the previous steps) and turn them into a dictionary.
We then connect to our movement server, which we registered with the Pyro nameserver as robotmovement. Once connected, we get our parameters as a dictionary and check that we were sent a valid request. If so, we take the first direction and number of seconds from each list (there will only be one item in each) and call the appropriate move function.
If everything went okay we return a success message; otherwise we return a bad request message. The user won’t see these, however, because the requests are sent using jQuery so that the camera view doesn’t keep refreshing.
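Again, the full listing is on the disc; the structure is roughly as follows, using the hypothetical direction names and movement methods from the sketches above:

#!/usr/bin/env python
# app.py - a sketch of the web application; replaces the Hello World code
import Pyro.core
from cgi import parse_qs

# Connect to the movement server we registered as robotmovement
Pyro.core.initClient()
robot = Pyro.core.getProxyForURI("PYRONAME://robotmovement")

def application(environ, start_response):
    # Turn the query string into a dictionary of lists
    params = parse_qs(environ.get('QUERY_STRING', ''))

    if 'direction' in params and 'seconds' in params:
        direction = params['direction'][0]
        seconds = float(params['seconds'][0])

        # Map the direction sent by the webpage to a movement function
        moves = {'forward': robot.forward,
                 'backward': robot.backward,
                 'left': robot.left_turn,
                 'right': robot.right_turn}

        if direction == 'stop':
            robot.stop()
            status, output = '200 OK', 'Stopped'
        elif direction in moves:
            moves[direction](seconds)
            status, output = '200 OK', 'Moved %s for %s seconds' % (direction, seconds)
        else:
            status, output = '400 Bad Request', 'Unknown direction'
    else:
        status, output = '400 Bad Request', 'Bad request'

    start_response(status, [('Content-Type', 'text/plain'),
                            ('Content-Length', str(len(output)))])
    return [output]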
Step 16 Yet more testing
Reload Apache with sudo apachectl graceful. Any error about Apache not being able to determine the server’s name can be safely ignored. Head to the webpage by typing your Pi’s IP address into a web browser. From there, use the buttons and the number of seconds to move for to control the robot’s movements. The stop button should override all other buttons.
Notice that each movement call always stops the robot when its own time limit expires. So if you send the robot forward for 5 seconds and then, 3 seconds into that movement, tell it to go backwards for 5 seconds, it will only go backwards for 2 seconds before the forward command’s stop kicks in. To solve this potential problem, you could implement a queuing system.
Step 17 How to disable what we’ve done
You may not want a camera broadcasting footage over the network constantly, for obvious reasons. If you want to easily disable what we’ve just done, you can comment out the four lines we added to /etc/rc.local by adding a hash character (#) to the start of those lines. You can disable Apache on boot in Debian with sudo update-rc.d -f apache2 remove (this will need redoing each time Apache is updated) and enable it again with sudo update-rc.d apache2 defaults. sudo /etc/init.d/apache2 start (or stop) will start or stop Apache manually.
Finished Site
Step 18 Further improvements
This article is a good base for a web interface for your Raspberry Pi robot, but there are many improvements that you could make, such as:
» Adding pan-and-tilt capability from last issue’s article to the web interface.
» Reverse-proxying the video stream through Apache, as mentioned previously.
» Adding authentication so that only people with a username and password can control the robot and view the stream.
» Using SSL for a secure video connection.
» Working out how long it takes for the motors to complete a full circle and using that info to be able to rotate the robot in degrees.
» Adding a queueing system for commands, as mentioned in step 16.
