Shallow Thoughts : tags : raspberry pi

Akkana's Musings on Open Source Computing, Science, and Nature.

Thu, 03 Jul 2014

Detecting wildlife with a PIR sensor (or not)

[PIR sensor] In my last crittercam installment, the NoIR night-vision crittercam, I was having trouble with false positives, where the camera would trigger repeatedly after dawn as leaves moved in the wind and the morning shadows marched across the camera's field of view. I wondered if a passive infra-red (PIR) sensor would be the answer.

I got one, and the answer is: no. It was very easy to hook up, and didn't cost much, so it was a worthwhile experiment; but it gets nearly as many false positives as camera-based motion detection. It isn't as sensitive to wind, but as the ground and the foliage heat up at dawn, the moving shadows are just as much a problem as they were with image-based motion detection.

Still, I might be able to combine the two, so I figure it's worth writing up.

Reading inputs from the HC-SR501 PIR sensor

[PIR sensor pins]

The PIR sensor I chose was the common HC-SR501 module. It has three pins -- Vcc, ground, and signal -- and two potentiometer adjustments.

It's easy to hook up to a Raspberry Pi because it can take 5 volts in on its Vcc pin, but its signal is 3.3v (a digital signal -- either motion is detected or it isn't), so you don't have to fool with voltage dividers or other means to get a 5v signal down to the 3.3v the Pi can handle. I used GPIO pin 7 for signal, because it's right on the corner of the Pi's GPIO header and easy to find.

There are two ways to track a digital signal like this. Either you can poll the pin in an infinite loop:

import time
import RPi.GPIO as GPIO

pir_pin = 7
sleeptime = 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(pir_pin, GPIO.IN)

while True:
    if GPIO.input(pir_pin):
        print "Motion detected!"
    time.sleep(sleeptime)

or you can use interrupts: tell the Pi to call a function whenever it sees a low-to-high transition on a pin:

import time
import RPi.GPIO as GPIO

pir_pin = 7
sleeptime = 300

def motion_detected(pir_pin):
    print "Motion Detected!"

GPIO.setmode(GPIO.BCM)
GPIO.setup(pir_pin, GPIO.IN)

GPIO.add_event_detect(pir_pin, GPIO.RISING, callback=motion_detected)

while True:
    print "Sleeping for %d sec" % sleeptime
    time.sleep(sleeptime)

Obviously the second method is more efficient. But I already had a loop set up checking the camera output and comparing it against previous output, so I tried that method first, adding support to my motion_detect.py script. I set up the camera pointing at the wall, and, as root, ran the script telling it to use a PIR sensor on pin 7, and the local and remote directories to store photos:

# python motion_detect.py -p 7 /tmp ~pi/shared/snapshots/
and whenever I walked in front of the camera, it triggered and took a photo. That was easy!

Reliability problems with add_event_detect

So easy that I decided to switch to the more efficient interrupt-driven model. Writing the code was easy, but I found it triggered more often: if I walked in front of the camera (and stayed the requisite 7 seconds or so that it takes raspistill to get around to taking a photo), when I walked back to my desk, I would find two photos, one showing my feet and the other showing nothing. It seemed like it was triggering when I got there, but also when I left the scene.

A bit of web searching indicates this is fairly common: that with RPi.GPIO a lot of people see triggers on both rising and falling edges -- e.g. when the PIR sensor starts seeing motion, and when it stops seeing motion and goes back to its neutral state -- when they've asked for just GPIO.RISING. Reports for this go back to 2011.

On the other hand, it's also possible that instead of seeing a GPIO falling edge, what was happening was that I was getting multiple calls to my function while I was standing there, even though the RPi hadn't finished processing the first image yet. To guard against that, I put a line at the beginning of my callback function that disabled further callbacks, then I re-enabled them at the end of the function after the Pi had finished copying the photo to the remote filesystem. That reduced the false triggers, but didn't eliminate them entirely.
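
A sketch of that kind of guard looks something like this (the photo-handling call is a hypothetical stand-in, not the actual motion_detect.py code):

def motion_detected(pir_pin):
    # Ignore any further PIR triggers until this one is fully handled.
    GPIO.remove_event_detect(pir_pin)
    take_and_copy_photo()    # hypothetical: run raspistill, copy to the remote filesystem
    # Photo safely copied; start listening for triggers again.
    GPIO.add_event_detect(pir_pin, GPIO.RISING, callback=motion_detected)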

Oh, well. The sun was getting low by this point, so I stopped fiddling with the code and put the camera out in the yard with a pile of birdseed and peanut suet nuggets in front of it. I powered on, sshed to the Pi and ran the motion_detect script, came back inside and ran a tail -f on the output file.

I had dinner and worked on other things, occasionally checking the output -- nothing! Finally I sshed to the Pi and ran ps aux and discovered the script was no longer running.

I started it again, this time keeping my connection to the Pi active so I could see when the script died. Then I went outside to check the hardware. Most of the peanut suet nuggets were gone -- animals had definitely been by. I waved my hands in front of the camera a few times to make sure it got some triggers.

Came back inside -- to discover that Python had gotten a segmentation fault. It turns out that nifty GPIO.add_event_detect() code isn't all that reliable, and can cause Python to crash and dump core. I ran it a few more times and sure enough, it crashed pretty quickly every time. Apparently GPIO.add_event_detect needs a bit more debugging, and isn't safe to use in a program that has to run unattended.

Back to polling

Bummer! Fortunately, I had saved the polling version of my program, so I hastily copied that back to the Pi and started things up again. I triggered it a few times with my hand, and everything worked fine. In fact, it ran all night and through the morning, with no problems except the excessive number of false positives, already mentioned.

[piñon mouse] False positives weren't a problem at all during the night. I'm fairly sure the problem happens when the sun starts hitting the ground. Then there's a hot spot that marches along the ground, changing position in a way that's all too obvious to the infra-red sensor.

I may try cross-checking between the PIR sensor and image changes from the camera. But I'm not optimistic about that working: they both get the most false positives at the same times, at dawn and dusk when the shadow angle is changing rapidly. I suspect I'll have to find a smarter solution, doing some image processing on the images as well as cross-checking with the PIR sensor.

I've been uploading photos from my various tests here: Tests of the Raspberry Pi Night Vision Crittercam. And as always, the code is on github: scripts/motioncam with some basic documentation on my site: motion-detect.py: a motion sensitive camera for Raspberry Pi or other Linux machines. (I can't use github for the documentation because I can't seem to find a way to get github to display html as anything other than source code.)

Tags: , , , ,
[ 20:13 Jul 03, 2014    More hardware | permalink to this entry | comments ]

Thu, 26 Jun 2014

A Raspberry Pi Night Vision Camera

[Mouse caught on IR camera]

When I built my Raspberry Pi motion camera (http://shallowsky.com/blog/hardware/raspberry-pi-motion-camera.html, and part 2), I always had the NoIR camera in the back of my mind. The NoIR is a version of the Pi camera module with the infra-red blocking filter removed, so you can shoot IR photos at night without disturbing nocturnal wildlife (or alerting nocturnal burglars, if that's your target).

After I got the daylight version of the camera working, I ordered a NoIR camera module and plugged it in to my RPi. I snapped some daylight photos with raspistill and verified that it was connected and working; then I waited for nightfall.

In the dark, I set up the camera and put my cup of hot chocolate in front of it. Nothing. I hadn't realized that although CCD cameras are sensitive in the near IR, the wavelengths only slightly longer than visible light, they aren't sensitive anywhere near the IR wavelengths that hot objects emit. For that, you need a special thermal camera. For a near-IR CCD camera like the Pi NoIR, you need an IR light source.

Knowing nothing about IR light sources, I did a search and came up with something called an "Infrared IR 12 Led Illuminator Board Plate for CCTV Security CCD Camera" for about $5. It seemed similar to the light sources used on a few pages I'd found for home-made night vision cameras, so I ordered it. Then I waited, because I stupidly didn't notice until a week and a half later that it was coming from China and wouldn't arrive for three weeks. Always check the shipping time when ordering hardware!

When it finally arrived, it had a tiny 2-pin connector that I couldn't match locally. In the end I bought a package of female-female SchmartBoard jumpers at Radio Shack which were small enough to make decent contact on the light's tiny-gauge power and ground pins. I soldered up a connector that would let me use a universal power supply, taking a guess that it wanted 12 volts (most of the cheap LED rings for CCD cameras seem to be 12V, though this one came with no documentation at all). I was ready to test.

Testing the IR light

[IR light and NoIR Pi camera]

One problem with buying a cheap IR light with no documentation: how do you tell whether your power supply is working, when the light it emits is completely invisible?

The only way to find out was to check on the Pi. I didn't want to have to run back and forth between the dark room where the camera was set up and the desktop where I was viewing raspistill images. So I started a video stream on the RPi:

$ raspivid -o - -t 9999999 -w 800 -h 600 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264

Then, on the desktop, I ran vlc and opened the network stream:
rtsp://pi:8554/
(I have a "pi" entry in /etc/hosts, but using an IP address also works).

Now I could fiddle with hardware in the dark room while looking through the doorway at the video output on my monitor.

It took some fiddling to get a good connection on that tiny connector ... but eventually I got a black-and-white view of my darkened room, just as I'd expect under IR illumination. I poked some holes in the milk carton and used twist-ties to secure the light source next to the NoIR camera.

Lights, camera, action

Next problem: mute all the blinkenlights, so my camera wouldn't look like a Christmas tree and scare off the nocturnal critters.

The Pi itself has a relatively dim red run light, and it's inside the milk carton so I wasn't too worried about it. But the Pi camera has quite a bright red light that goes on whenever the camera is being used. Even through the thick milk carton bottom, it was glaring and obvious. Fortunately, you can disable the Pi camera light: edit /boot/config.txt and add this line

disable_camera_led=1

My USB wi-fi dongle has a blue light that flickers as it gets traffic. Not super bright, but attention-grabbing. I addressed that issue with a triple thickness of duct tape.

The IR LEDs -- remember those invisible, impossible-to-test LEDs? Well, it turns out that in darkness, they emit a faint but still easily visible glow. Obviously there's nothing I can do about that -- I can't cover the camera's only light source! But it's quite dim, so with any luck it's not spooking away too many animals.

Results, and problems

For most of my daytime testing I'd used a threshold of 30 -- meaning a pixel was considered to have changed if its value differed by more than 30 from the previous photo. That didn't work at all in IR: changes are much more subtle since we're seeing essentially a black-and-white image, and I had to divide that by three and use a threshold of 10 or 11 if I wanted the camera to trigger at all.

With that change, I did capture some nocturnal visitors, and some early morning ones too. Note the funny colors on the daylight shots: that's why cameras generally have IR-blocking filters if they're not specifically intended for night shots.

[mouse] [rabbit] [rock squirrel] [house finch]

Here are more photos, and larger versions of those: Images from my night-vision camera tests.

But I'm not happy with the setup. For one thing, it has far too many false positives. Maybe one out of ten or fifteen images actually has an animal in it; the rest just triggered because the wind made the leaves blow, or because a shadow moved or the color of the light changed. A simple count of differing pixels is clearly not enough for this task.

Of course, the software could be smarter about things: it could try to identify large blobs that had changed, rather than small changes (blowing leaves) all over the image. I already know SimpleCV runs fine on the Raspberry Pi, and I could try using it to do object detection.

But there's another problem with detection purely through camera images: the Pi is incredibly slow to capture an image. It takes around 20 seconds per cycle; some of that is waiting for the network but I think most of it is the Pi talking to the camera. With quick-moving animals, the animal may well be gone by the time the system has noticed a change. I've caught several images of animal tails disappearing out of the frame, including a quail who visited yesterday morning. Adding smarts like SimpleCV will only make that problem worse.

So I'm going to try another solution: hooking up an infra-red motion detector. I'm already working on setting up tests for that, and should have a report soon. Meanwhile, pure image-based motion detection has been an interesting experiment.

Tags: , , , ,
[ 13:31 Jun 26, 2014    More hardware | permalink to this entry | comments ]

Sat, 24 May 2014

Raspberry Pi Motion Camera: Part 2, using gphoto2

I wrote recently about the hardware involved in my Raspberry Pi motion-detecting wildlife camera. Here are some more details.

The motion detection software

I started with the simple and clever motion-detection algorithm posted by "brainflakes" in a Raspberry Pi forum. It reads a camera image into a PIL (Python Imaging Library) Image object, then compares bytes inside that Image's buffer to see how many pixels have changed, and by how much. It allows for monitoring only a test region instead of the whole image, and can even create a debug image showing which pixels have changed. A perfect starting point.
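
Here's a rough sketch of that pixel-counting idea -- my own illustration, not the brainflakes code itself -- assuming two same-sized PIL images and comparing just their green channels:

from PIL import Image

def count_changed_pixels(img1, img2, threshold=30):
    buf1, buf2 = img1.load(), img2.load()
    changed = 0
    for x in range(img1.size[0]):
        for y in range(img1.size[1]):
            # The green channel carries most of the luminance detail.
            if abs(buf1[x, y][1] - buf2[x, y][1]) > threshold:
                changed += 1
    return changed

# e.g. count_changed_pixels(Image.open('prev.jpg'), Image.open('cur.jpg'))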

Camera support

As part of the PiDoorbell project, I had already written a camera wrapper that could control either a USB webcam or the pi camera module, if it was installed. Initially that plugged right in.

But I was unhappy with the Pi camera's images -- it can't focus closer than five feet (though a commenter to my previous article pointed out that it's possible to break the seal on the lens and refocus it manually). Without refocusing, the wide-angle lens means that a bird five feet away is pretty small, and even when you get something in focus the images aren't very sharp. And a web search for USB webcams with good optical quality was unhelpful -- the few people who care about webcam image quality seem to care mostly about getting the widest-angle lens possible, the exact opposite of what I wanted for wildlife.

[Motion detector camera with external  high-res camera] Was there any way I could hook up a real camera, and drive it from the Pi over USB as though it were a webcam? The answer turned out to be gphoto2.

But only a small subset of cameras are controllable over USB with gphoto2. (I think that's because the cameras don't allow control, not because gphoto doesn't support them.) That set didn't include any of the point-and-shoot cameras we had in the house; and while my Rebel DSLR might be USB controllable, I'm not comfortable about leaving it out in the backyard day and night.

With gphoto2's camera compatibility list in one tab and ebay in another, I looked for a camera that was available, cheap (since I didn't know if this was going to work at all), and controllable. I ordered a used Canon A520.

As I waited for it to arrive, I fiddled with my USB-or-pi-camera to make a start at adding gphoto2 support. I ended up refactoring the code quite a bit to make it easy to add new types of cameras besides the three it supports now -- pi, USB webcam, and gphoto2. I called the module pycamera.

Using gphoto2

When the camera arrived, I spent quite a while fiddling with gphoto2 learning how to capture images. That turns out to be a bit tricky -- there's no documentation on the various options, apparently because the options may be different for every camera, so you have to run

$ gphoto2 --set-config capture=1 --list-config
to get a list of options the camera supports, and then, for each of those options, run
$ gphoto2 --get-config [option name]
to see what values that option can take.
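
Once you know the right config settings, actually grabbing a photo from a script mostly means shelling out to gphoto2. A sketch along these lines (an illustration, not the pycamera module) is enough to capture an image and pull it onto the Pi:

import subprocess

def gphoto_capture(filename):
    # --capture-image-and-download takes the photo and fetches it from the
    # camera; --filename says where to save it locally.
    subprocess.check_call(["gphoto2",
                           "--capture-image-and-download",
                           "--filename", filename,
                           "--force-overwrite"])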

Dual-camera option

Once I got everything working, the speed and shutter noise of capturing made me wonder if I should worry about the lifespan of the Canon if I used it to capture snapshots every 15 seconds or so, day and night.

Since I still had the Pi cam hooked up, I fiddled the code so that I could use the pi cam to take the test images used to detect motion, and save the real camera for the high-resolution photos when something actually changes. Saves wear on the more expensive camera, and it's certainly a lot quieter that way.

Uploading

To get the images off the Pi to where other computers can see them, I use sshfs to mount a filesystem from another machine on our local net.

Unfortunately, sshfs on the pi doesn't work quite right. Apparently it uses out-of-date libraries (and gives a warning to that effect). You have to be root to use it at all, unlike newer versions of sshfs, and then, regardless of the permissions of the remote filesystem or where you mount it locally, you can only access the mounted filesystem as root.
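
In practice that means the mount itself gets run as root on the Pi, something like this (the hostname and paths here are made up):

# sshfs mydesktop:/home/me/shared /root/shared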

Fortunately I normally run the motion detector as root anyway, because the picamera Python module requires it, and I've just gotten in the habit of using it even when I'm not using python-picamera. But if you wanted to run as non-root, you'd probably have to use NFS or some other remote filesystem protocol. Or find a newer version of sshfs.

Testing the gphoto setup

[Rock squirrel using Raspberry Pi camera] For reference, here's an image using the previous version of the setup, with the Raspberry Pi camera module. Click on the image to see a crop of the full-resolution image in daylight -- basically the best the camera can do. Definitely not what I was hoping for.

So I eagerly set up the tripod and hooked up the setup with the Canon. I had a few glitches in trying to test it. First, no birds; then later I discovered Dave had stolen my extension cord, but I didn't discover that until after the camera's batteries needed recharging.

A new extension cord and an external power supply for the camera, and I was back in business the next day.

[Rock squirrel using Raspberry Pi camera] And the results were worth it. As you can see here, using a real camera does make a huge difference. I used a zoom setting of 6 (it goes to 12). Again, click on the image to see a crop of the full-resolution photo.

In the end, I probably will order one of the No-IR Raspberry pi cameras, just to have an easy way of seeing what sorts of critters visit us at night. But for daylight shots, an external camera is clearly the way to go.

The scripts

The current version of the script is motion_detect.py and of course it needs my pycamera module. And here's documentation for the motion detection camera.

Tags: , , ,
[ 20:09 May 24, 2014    More hardware | permalink to this entry | comments ]

Thu, 15 May 2014

A Raspberry Pi motion-detecting wildlife camera

I've been working on an automated wildlife camera, to catch birds at the feeder, and the coyotes, deer, rabbits and perhaps roadrunners (we haven't seen one yet, but they ought to be out there) that roam the juniper woodland.

This is a similar project to the PiDoorbell project presented at PyCon, and my much earlier proximity camera project that used an Arduino and a plug computer; but for a wildlife camera I didn't want to use a sonar rangefinder. For one thing, it won't work with a bird feeder -- the feeder is always there, so the addition of a bird won't change anything as far as a sonar rangefinder is concerned. For another, the rangefinders aren't very accurate beyond about six feet.

Starting with a Raspberry Pi was fairly obvious. It's low power, cheap, it even has an optional integrated camera module that has reasonable resolution, and I could re-use a lot of the camera code I'd already written for PiDoorbell.

I patched together some software for testing. I'll write in more detail about the software in a separate article, but I started with the simple motion detection code posted by "brainflakes" in the Raspberry Pi forums. It's a slick little piece of code you'll find in various versions all over the net; it uses PIL, the Python Imaging Library, to compare a specified region from successive photos to see how much has changed.

One aside about the brainflakes code: most of the pages you'll find referencing it tell you to install python-imaging-tk. But there's nothing in the code that uses tk, and python-imaging is really all you need to install. I wrote a GUI wrapper for my motion detection code using gtk, so I had no real need to learn the Tk equivalent.

Once I had some software vaguely working, it was time for testing.

The hardware

One big problem I had to solve was the enclosure. I needed something I could put the Pi in that was moderately waterproof -- maybe not enough to handle a raging thunderstorm, but rain or snow can happen here at any time without much warning. I didn't want to have to spend a lot of time building and waterproofing it, because this is just a test run and I might change everything in the final version.

I looked around the house for plastic objects that could be repurposed into a camera enclosure. A cookie container from the local deli looked possible, but I wasn't quite happy with it. I was putting the last of the milk into my morning coffee when I realized I held in my hand a perfect first-draft camera enclosure.

[Milk carton camera enclosure] A milk carton must be at least somewhat waterproof, right? Even if it's theoretically made of paper.

[cut a hole to mount the Pi camera] I could use the flat bottom as a place to mount the Pi camera with its two tiny screw holes,

[Finished milk carton camera enclosure] and then cut a visor to protect the camera from rain.

[bird camera, installed] It didn't take long to whip it all together: a little work with an X-acto knife, a little duct tape. Then I put the Pi inside it, took it outside and bungeed it to the fence, pointing at the bird feeder.

A few issues I had to resolve:

Raspbian has rather complicated networking. I was using a USB wi-fi dongle, but I had trouble getting the Pi to boot configured properly to talk to our WPA router. In Raspbian networking is configured in about six different places, any one of which might do something like prioritize the not-connected eth0 over the wi-fi dongle, making it impossible to connect anywhere. I ended up uninstalling Network Manager and turning off ifplugd and everything else I could find so it would use my settings in /etc/network/interfaces, and in the end, even though ifconfig says it's still prioritizing eth0 over wlan0, I got it talking to the wi-fi.
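
For reference, the sort of /etc/network/interfaces stanza that ends up doing the work, once Network Manager and ifplugd are out of the way, looks something like this (the SSID and passphrase are placeholders, and details vary between Raspbian releases):

auto wlan0
allow-hotplug wlan0
iface wlan0 inet dhcp
    wpa-ssid "MyNetwork"
    wpa-psk "MyPassphrase"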

I also had to run everything as root. The python-picamera module imports RPi.GPIO and needs access to /dev/mem, and even if you chmod /dev/mem to give yourself adequate permissions, it still won't work except as root. But if I used ssh -X to the Pi and then ran my GUI program with sudo, I couldn't display any windows because the ssh permission is for the "pi" user, not root.

Eventually I gave up on sudo, set a password for root, and used ssh -X root@pi to enable X.

The big issue: camera quality

But the real problem turned out to be camera quality.

The Raspberry Pi camera module has a resolution of 2592 x 1944, or 5 megapixels. That's terrific, far better than any USB webcam. Clearly it should be perfect for this task.

[House finch with the bad Raspberry Pi camera module] Update: see below. It's not a good camera, but it turns out I had a lens problem and it's not this bad.

So, the Pi camera module might be okay if all I want is a record of what animals visit the house. This image is good enough, just barely, to tell that we're looking at a house finch (only if we already rule out similar birds like purple finch and Cassin's finch -- the photo could never give us enough information to distinguish among similar birds). But what good is that? I want decent photos that I can put on my web site.

I have a USB camera, but it's only one megapixel and gives lousy images, though at least they're roughly in focus so they're better than the Pi cam.

So now I'm working on a setup where I drive an external camera from the Pi using gphoto2. I have most of the components working, but the code was getting ugly handling three types of cameras instead of just two, so I'm refactoring it. With any luck I'll have something to write about in a week or two.

Meanwhile, the temporary code is in my github rpi directory -- but it will probably move from there soon.

I'm very sad that the Pi camera module turned out to be so bad. I was really looking forward to buying one of the No-IR versions and setting up a night wildlife camera. I've lost enthusiasm for that project after seeing how bad the images were. I may have to investigate how to remove the IR filter from a point-and-shoot camera, after I get the daylight version working.

[rock squirrel with cheeks full of sunflower seeds] Update, a few days later: It turns out I had some spooge on the lens. It's not quite as bad as I made it out to be. Here's a sample. It's still not a great camera, and it can't focus anywhere near as close as the 2 feet I've seen claimed -- 5 feet is about the closest mine can focus, which means I can't get very close to the wildlife, which was a lot of the point of building a wildlife camera. I've seen suggestions of putting reading glasses in front of the lens as a cheap macro adaptor.

Instead, I'm going ahead with the gphoto2 option, which is about ready to test -- but the noIR Pi camera module might be marginally acceptable for a night wildlife camera.


Tags: , , ,
[ 13:30 May 15, 2014    More hardware | permalink to this entry | comments ]

Wed, 23 Apr 2014

Some code from PiDoorbell

If anyone has been waiting for the code repository for PiDoorbell, the Raspberry Pi project we presented at PyCon a couple of weeks ago, at least part of it (the parts I wrote) is also available in my GitHub scripts repo, in the rpi subdirectory. It's licensed as GPLv2-or-later.

That includes the code that drives the HC-SR04 sonar rangefinder, and the script that takes photos and handles figuring out whether you have a USB camera or a Pi Camera module.

It doesn't include the Dropbox or Twilio code. For that I'm afraid you'll have to wait for the official PiDoorbell repo. I'm not clear what the holdup is on getting the repo opened up.

The camera script, piphoto.py, has changed quite a bit in the couple of weeks since PyCon. I've been working on a similar project that doesn't use the rangefinder, and relies only on the camera to detect motion, by measuring changes between the previous photo and the current one. I'm building a wildlife camera, and the rangefinder trick doesn't work well if there's a bird feeder already occupying the target range.

Of course, using motion detection means I'll get a lot of spurious photos of shadows, tree limbs bending in the wind and so forth. It'll be an interesting challenge seeing if I can make the code smart enough to handle that. Of course, I'll write about the project in much more detail once I have it working.

It looks like the biggest issue will be finding a decent camera I can control from a Pi. The Pi Camera module looked so appealing -- and it comes in a night version, with the IR filter removed, perfect for those coyote, rabbit and deer pictures! -- but sadly, it looks like its quality is so poor that it really isn't useful for much of anything. It's great for detecting what types of animals visit you (especially at night), but, sadly, no good for taking photos you'd actually want to display.

If anyone knows of a good camera that can be driven from Linux over USB -- perhaps a normal digital camera that supports the USB camera protocol? -- please let me know! My web searches so far haven't been very illuminating.

Meanwhile, I hope someone finds the rangefinder and camera driving software useful. And stay tuned for more detailed articles about my wildlife camera project!

Tags: , ,
[ 11:57 Apr 23, 2014    More hardware | permalink to this entry | comments ]

Sun, 06 Apr 2014

Snow-Hail while preparing for Montreal

Things have been hectic in the last few days before I leave for Montreal with last-minute preparation for our PyCon tutorial, Build your own PiDoorbell - Learn Home Automation with Python next Wednesday.

[Snow-hail coming down on the Piñons] But New Mexico came through on my next-to-last full day with some pretty interesting weather. A windstorm in the afternoon gave way to thunder (but almost no lightning -- I saw maybe one indistinct flash) which gave way to a strange fluffy hail that got gradually bigger until it eventually grew to pea-sized snowballs, big enough and snow enough to capture well in photographs as they came down on the junipers and in the garden.

Then after about twenty minutes the storm stopped and the sun came out. And now I'm back to tweaking tutorial slides and thinking about packing while watching the sunset light on the Rio Grande gorge.

But tomorrow I leave it behind and fly to Montreal. See you at PyCon!

Tags: , , , , ,
[ 18:55 Apr 06, 2014    More misc | permalink to this entry | comments ]

Wed, 29 Jan 2014

PyCon Tutorial: Build your own PiDoorbell - Learn Home Automation with Python

[Raspberry Pi from wikipedia] The first batch of hardware has been ordered for Rupa's and my tutorial at PyCon in Montreal this April!

We're presenting Build your own PiDoorbell - Learn Home Automation with Python on the afternoon of Wednesday, April 9.

It'll be a hands-on workshop, where we'll experiment with the Raspberry Pi's GPIO pins and learn how to control simple things like an LED. Then we'll hook up sonar rangefinders to the RPis, and build a little device that can be used to monitor visitors at your front door, birds at your feeder, co-workers standing in front of your monitor while you're away, or just about anything else you can think of.

Participants will bring their own Raspberry Pi computers and power supplies -- attendees of last year's PyCon got them there, but a new Model A can be gotten for $30, and a model B for $40.

We'll provide everything else. We worried that requiring participants to bring a long list of esoteric hardware was just asking for trouble, so we worked a deal with PyCon and they're sponsoring hardware for attendees. Thank you, PyCon! CodeChix is fronting the money for the kits and helping with our travel expenses, thanks to donations from some generous sponsors. We'll be passing out hardware kits and SD cards at the beginning of the workshop, which attendees can take home afterward.

We're also looking for volunteer T/As. The key to a good hardware workshop is having lots of helpers who can make sure everybody's keeping up and nobody's getting lost. We have a few top-notch T/As signed up already, but we can always use more. We can't provide hardware for T/As, but most of it's quite inexpensive if you want to buy your own kit to practice on. And we'll teach you everything you need to know about how to get your PiDoorbell up and running -- no need to be an expert at hardware or even at Python, as long as you're interested in learning and in helping other people learn.

This should be a really fun workshop! PyCon tutorial sign-ups just opened recently, so sign up for the tutorial (we do need advance registration so we know how many hardware kits to buy). And if you're going to be at PyCon and are interested in being a T/A, drop me or Rupa a line and we'll get you on the list and get you all the information you need.

See you at PyCon!

Tags: , , , , ,
[ 20:32 Jan 29, 2014    More hardware | permalink to this entry | comments ]

Sat, 25 May 2013

Telling your Raspberry Pi that your terminal is bigger than 24 lines

When I'm working with an embedded Linux box -- a plug computer, or most recently with a Raspberry Pi -- I usually use GNU screen as my terminal program. screen /dev/ttyUSB0 115200 connects to the appropriate USB serial port at the appropriate speed, and then you can log in just as if you were using telnet or ssh.

With one exception: the window size. Typically everything is fine until you use an editor, like vim. Once you fire up an editor, it assumes your terminal window is only 24 lines high, regardless of its actual size. And even after you exit the editor, somehow your window will have been changed so that it scrolls at the 24th line, leaving the bottom of the window empty.

Tracking down why it happens took some hunting. There are lots of different places the screen size can be set. Libraries like curses can ask the terminal its size (but apparently most programs don't). There's a size built into most terminfo entries (specified by the TERM environment variable) -- but it's not clear that gets used very much any more. There are environment variables LINES and COLUMNS, and a lot of programs read those; but they're often unset, and even if they are set, you can't trust them. And setting any of these didn't help -- I could change TERM and LINES and COLUMNS all I wanted, but as soon as I ran vim the terminal would revert to that scrolling-at-24-lines behavior.

In the end it turned out the important setting was the tty setting. You can get a summary of what the tty driver thinks its size is:

% stty size
32 80

But to set it, you use rows and columns rather than size. I discovered I could type stty rows 32 (or whatever my current terminal size was), and then I could run vim and it would stay at 32 rather than reverting to 24. So that was the important setting vim was following.

The basic problem was that screen, over a serial line, doesn't have a protocol for passing the terminal's size information, the way a remote login program like ssh, rsh or telnet does. So how could I get my terminal size set appropriately on login?

Auto-detecting terminal size

There's one program that will do it for you, which I remembered from the olden days of Unix, back before programs like telnet had this nice size-setting built in. It's called resize, and on Debian, it turned out to be part of the xterm package.

That's actually okay on my current Raspberry Pi, since I have X libraries installed in case I ever want to hook up a monitor. But in general, a little embedded Linux box shouldn't need X, so I wasn't very satisfied with this solution. I wanted something with no X dependencies. Could I do the same thing in Python?

How it works

Well, as I mentioned, there are ways of getting the size of the actual terminal window, by printing an escape sequence and parsing the result.

But finding the escape sequence was trickier than I expected. It isn't written about very much. I ended up running script and capturing the output that resize sent, which seemed a little crazy: '\e[7\e[r\e[999;999H\e[6n' (where \e means the escape character). Holy cow! What are all those 999s?

Apparently what's going on is that there isn't any sequence to ask xterm (or other terminal programs) "What's your size?" But there is a sequence to ask, "Where is the cursor on the screen right now?"

So what you do is send a sequence telling it to go to row 999 and column 999; and then another sequence asking "Where are you really?" Then read the answer: it's the window size.

(Note: if we ever get monitors big enough for 1000x1000 terminals, this will fail. I'm not too worried.)

Reading the answer

Okay, great, we've asked the terminal where it is, and it responds. How do we read the answer? That was actually the trickiest part.

First, you have to write to /dev/tty, not just stdout.

Second, you need the output to be available for your program to read, not just echo in the terminal for the user to see. Setting the tty to noncanonical mode does that.

Third, you can't just do a normal blocking read of stdin -- it'll never return. Instead, put stdin into non-blocking mode and use select() to see when there's something available to read.

And of course, you have to make sure you reset the terminal back to normal canonical line-buffered mode when you're done, whether or not your read succeeds.

Once you do all that, you can read the output, which will look something like "\e[32;80R". The two numbers, of course, are the lines and columns values you want; ignore the rest.
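
Putting those pieces together, here's a minimal Python sketch of the query side -- my own illustration, not the termsize script linked below:

import os
import select
import termios

def query_terminal_size(timeout=2):
    # Talk to the controlling terminal directly, not stdout.
    fd = os.open("/dev/tty", os.O_RDWR | os.O_NONBLOCK)
    old_attrs = termios.tcgetattr(fd)
    try:
        # Noncanonical mode, no echo: the reply comes to us, not the screen.
        new_attrs = termios.tcgetattr(fd)
        new_attrs[3] &= ~(termios.ICANON | termios.ECHO)
        termios.tcsetattr(fd, termios.TCSANOW, new_attrs)

        # Save the cursor, jump to row 999 column 999, then ask the terminal
        # where the cursor really ended up.
        os.write(fd, b"\x1b7\x1b[999;999H\x1b[6n")

        # The reply looks like ESC [ rows ; cols R. Use select() so we don't
        # block forever if the terminal never answers.
        reply = b""
        while not reply.endswith(b"R"):
            ready, _, _ = select.select([fd], [], [], timeout)
            if not ready:
                return None
            reply += os.read(fd, 32)
        rows, cols = reply[reply.find(b"[") + 1:-1].split(b";")
        return int(rows), int(cols)
    finally:
        os.write(fd, b"\x1b8")    # put the cursor back where it was
        termios.tcsetattr(fd, termios.TCSANOW, old_attrs)
        os.close(fd)

The rows and columns it returns are what you then feed to the fcntl.ioctl() call below.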

stty in python

Oh, yes, and one other thing: once you've read the terminal size, how do you set the stty size appropriately? Running system('stty rows %d' % rows) seems like it should work, but it doesn't, probably because it's using stdout instead of /dev/tty. But I did find one way to do it, the enigmatic:

fcntl.ioctl(fd, termios.TIOCSWINSZ,
            struct.pack("HHHH", rows, cols, 0, 0))

Here it all is in one script, which you can install on your Raspberry Pi (or other embedded Linux box) and run from .bash_profile:
termsize: set stty size to the size of the current terminal window.

Tags: , , , ,
[ 19:47 May 25, 2013    More hardware | permalink to this entry | comments ]

Sat, 18 May 2013

Running Raspberry Pi off a battery

In my post about Controlling a toy car with a Raspberry Pi, I skipped over one important detail: the battery. How do you power the RPi while it's driving around the room?

Most RPi sites warn that you shouldn't run the Pi from a power supply rated for less than an amp. I suspect that's overstated, and the Pi probably doesn't draw more than half of that most of the time; but add the draw of two motors and we're talking a fairly beefy battery, not a couple of AAs or a 9V.

Luckily, as an R/C plane pilot, I have a fridge full of small 2- and 3-cell lithium-polymer batteries (and a li-po charger to go with them). The problem is: the Pi is rather picky about its input voltage. It wants 5V and nothing else. A 2-cell li-po is 7.4V. So I needed some sort of voltage regulator.

[5V voltage regulator] It's easy enough to get a simple 5V voltage regulator (pictured at right) -- 30c at Jameco, not much more locally. But they're apparently fairly inefficient, and need a heat sink for high current loads. [5V step-down power converter] So I decided to blow the big bucks ($15) for a 5V step-down power converter (left) that claims to be 94% efficient with no need for a heat sink.

Unlike most of Adafruit's products, this one comes with no tutorials and no hints as to pinouts, but after a little searching, I determined that the pins worked the same way as the cheap voltage regulators. With the red logo facing you, the left pin (your left) is input power from the battery; middle is ground (connect this to the battery's ground which is shared with the Pi's ground); the right pin is the regulated 5V output, which goes to pin 2 on the Pi's GPIO connector.

I was able to run both the RPi and the motor drive circuit off the same 7.4 volt 2-cell li-po battery (which almost certainly wouldn't work with 4 AAs, though it might work with 8). A 500 mAh battery seems to be plenty to drive the RPi and the car, though I don't know how long the battery life will be. I'll probably be using 610 mAh batteries for most of my testing, since I have a collection of them for the aerial combat planes.

Here's a wiring diagram made with Fritzing showing how to hook up the battery to power a RPi. If you're driving motors, you can run a line from the battery's + terminal (the left pin of the voltage regulator) as your motor voltage source, and use the right pin as your 5V logic source for whatever motor controller chip you're using.
[Battery-powered Raspberry Pi]

Tags: , ,
[ 17:50 May 18, 2013    More hardware | permalink to this entry | comments ]

Sun, 12 May 2013

Driving two DC motors with a Raspberry Pi

[Raspberry Pi robotic car]

In my previous article about pulse-width modulation on Raspberry Pi, I mentioned that the reason I wanted PWM on several pins at once was to drive several motors, for a robotic car.

But there's more to driving motors than just PWM. The GPIO output pins of a Pi don't have either enough current or enough voltage to drive a motor. So you need to use a separate power supply to drive the motors, and do some sort of switching -- at minimum, a transistor or relay for each motor.

There are lots of motor driver chips and, for Arduinos, "motor shields"; such things are starting to become available for the Pi as well. But motor shields are expensive, usually costing more than the Pi itself. If you're trying to outfit a robotics class, or to help low-income students build robots, that's not a great solution.

When I struggled with this problem for the Arduino, the solution I eventually hit on was a SN754410 H-bridge chip. For under $2, you get bidirectional control of two DC motors. For each motor, you send input to the chip via a PWM line and two directional control lines.

[Snarl of wires driving a car with a Raspberry Pi] The only problem is the snarl of wiring. One PWM and two direction lines per motor is six wires, plus power for the chip's logic side, power for the motors, and ground, and the three pins for a serial cable, and you're talking a lot of wires to plug in. Although this is all easy in concept, it's also easy to get a wire plugged in one spot over on the breadboard from where it ought to be, and then nothing works.

I spent too much time making tables of what should get plugged into where. I ended up with a table like this:
Pi connector pin    GPIO (BCM)      SN754410 pin
Pi 2                5V power        Breadboard bottom V+ row
Pi 18               24              1 (motor 1 PWM)
Pi 15               22              1 (motor 0 PWM)
Pi 24               8 (SPI CE0)     4 (motor 1 direc 0)
Pi 26               7 (SPI CE1)     14 (motor 1 direc 1)
Pi 25               Gnd             Breadboard both grounds
Pi 19               10 (MOS1)       3 (motor 0 direc 0)
Pi 21               9 (MOS0)        13 (motor 0 direc 1)
motor 0             --              5, 11
motor 1             --              6, 12
... though, as you'll see, some of those pin assignments ended up getting changed later.

One more thing: I found that I had to connect the chip's logic V+ (pin 2 on the SN754410) to the 5v pin on the RPi, not the 3.3V pin. The SN754410 is okay with 3.3V logic signals, but it seems to need a full 5V of power.

Programming it

The software control is a little trickier than it ought to be, too, because of the 2-wire control lines on each motor. With both lines high or both lines low, nothing moves. (Some motor driver chips distinguish between those two states: e.g. both low might be a brake, while both high lets the motor freewheel; but I haven't seen anything indicating the SN754410 makes any distinction.) Then set one line high, the other low, and the motor spins one way; reverse the lines, and the motor spins the other way. Assuming, of course, the PWM line is sending a signal.

Of course, you need RPI.GPIO version 0.5.2a or later to do any of this PWM control. Get it via pip install --upgrade RPi.GPIO -- the RPI.GPIO in Raspbian mis-reports its version and is really 0.5.1a.

Simple enough in concept. Okay, now try explaining that to beginning programmers. No, thanks! So I wrote a PiMotor class in Python that takes care of all those details. Initialize it with the pins you want to use, then use calls like set_speed(s) and stop(). It's on GitHub at pimotors.py.
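
For the curious, here's a stripped-down sketch of the idea -- not the actual pimotors.py -- for a single motor channel on the SN754410:

import RPi.GPIO as GPIO

class Motor:
    def __init__(self, pwm_pin, dir_pin0, dir_pin1, freq=100):
        # One PWM pin for speed, two GPIO pins for direction.
        self.dir_pin0 = dir_pin0
        self.dir_pin1 = dir_pin1
        for pin in (pwm_pin, dir_pin0, dir_pin1):
            GPIO.setup(pin, GPIO.OUT)
        self.pwm = GPIO.PWM(pwm_pin, freq)    # needs RPi.GPIO 0.5.2a or later
        self.pwm.start(0)

    def set_speed(self, speed):
        # speed runs from -100 (full reverse) through 0 (stopped) to 100 (full forward).
        GPIO.output(self.dir_pin0, speed > 0)
        GPIO.output(self.dir_pin1, speed < 0)
        self.pwm.ChangeDutyCycle(min(abs(speed), 100))

    def stop(self):
        self.pwm.ChangeDutyCycle(0)

# Typical use, with made-up BCM pin numbers:
#   GPIO.setmode(GPIO.BCM)
#   left = Motor(25, 23, 24)
#   left.set_speed(75)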

I put the H-bridge chip on a breadboard, wired up all the lines to the Pi and a lithium-polymer airplane battery, and (after several hours of head-banging while I found all the errors in my wiring), sure enough, I could get the motors to spin.

But one thing I found while wiring was that I couldn't always use the GPIO lines I'd intended to use. The RPi seemingly has a lot of GPIO lines -- but nearly all of them have other purposes, and I haven't found any good explanation of what those uses are or how to know when they're in use. I found that quite frequently, I'd try a GPIO.setup(pin, GPIO.OUT) and get "This channel is already in use". Sometimes GPIO.cleanup() helped, and sometimes it didn't. None of this stuff has much documentation, and I haven't found any IRC channel or mailing list for discussing RPi GPIO. And of course, there's no relation between the pin number on the header and the GPIO pin number. So I spent a lot of time counting breadboard rows and correlating to a printout I'd made of the RPi's GPIO socket.

Putting the circuit on a proto-board

Once I got it working, I realized how much I didn't relish the thought of ever doing it again -- like whenever I needed to unplug the motors from the Pi and use it for something else.

Fortunately, at some point I'd bought an Adafruit Pi Plate, sort of the RPi equivalent of Adafruit's Arduino ProtoShield. I love protoshields. I have a bunch of them, and I use them for all sorts of Arduino projects, so I'd bought the Pi Plate thinking it might come in handy some day. It's not quite like a protoshield, because it's expensive and heavy, loaded up with lots of pointless screw terminals. But you don't have to solder the screw terminals on; just solder the headers and you have a protoshield for your RPi on which you can put a mini breadboard and build your motor circuit.

I do wish, though, that Adafruit or someone made a simple, basic proto board PCB with headers for the Pi. No screw terminals, no extra parts, just the PCB and headers, to make it easy and cheap to swap between different RPi projects. The HobbyTronics Slice of Pi looks intriguing, but the GPIO pins it exposes don't seem to be the same ones exposed on the RPI's GPIO header. I'd be interested in hearing from anyone who's tried one of these.

[Raspberry Pi motor circuitn] Anyway, with the Pi Plate shield, my motor circuit looks much neater, and I can unplug it from my RPi without fear that it'll mean another half hour if I ever want to get the motors hooked up again. I did have to change some of the pin assignments yet again, because the Pi Plate doesn't expose all the GPIO pins available on the RPi header. I ended up using 25, 23, 24 for the first motor, and 17, 21, 22 for the second.

I wanted to make a circuit diagram with Fritzing, but it turns out the Fritzing I have can't import part definitions like the one for Raspberry Pi, and the current Fritzing doesn't work on Debian Wheezy. So that'll have to wait. But here's a photo of my breadboarded circuit on the Pi Plate, and a link to my motor breadboarded circuit using a cable to the GPIO.

Kevin Mark tipped me off that Fritzing is quite easy to build under Debian, if you first apt-get install qt4-qmake libqt4-dev libboost1.49-dev
I had to add one more package to Kevin's list, libqt4-sql-sqlite, or I got a lot of QSQLITE driver not loaded and other errors on the terminal, and a dialog saying "Unable to find the following 114 parts" followed by another dialog too big to fit on the screen with a list of all the missing parts.
Once those packages are installed, download the Fritzing source tarball, qmake, make, and sudo make install.

And my little car can go forward, spin around in both directions, and then reverse! Now the trick will be to find some sensors I can use with the pins remaining ...

Tags: , ,
[ 14:08 May 12, 2013    More hardware | permalink to this entry | comments ]

Sat, 04 May 2013

PWM for LEDs and motors with a Raspberry Pi

I've written about how to drive small DC motors with an Arduino, in order to drive a little toy truck around. But an Arduino, while great at talking to hardware, isn't very powerful. It's easy to add simple sensors to the truck so it can stop before hitting the wall; but if I wanted to do anything complicated -- like, say, image processing with a camera -- the Arduino really isn't enough.

[Raspberry Pi set up for motor control] Enter Raspberry Pi. It isn't a super-fast processor either, but it's fast enough to run Linux, Python, and image processing packages like SimpleCV. A Raspberry-Pi driven truck would be a lot more powerful: in theory, I could make a little Mars Rover to drive around my backyard. If, that is, I could get the RPi driving the car's motors.

Raspberry Pi, sadly, has a lot of limitations as a robotics platform. It's picky about input voltages and power; it has no analog inputs, and only one serial port (which you probably want to use for a console if you're going to debug your robot reliably). But my biggest concern was that it has only one pulse-width modulation (PWM) output, while I needed two of them to control the car's two motors. It's theoretically possible to do software PWM on any pin -- but until recently, there were no libraries supporting that.

Until recently. I've been busy for the last month or two and haven't been doing much RPi experimenting. As I got back into it this week, I discovered something delightful: in the widely available python library RPi.GPIO, Software PWM is available starting with 0.5.2a.

Getting the right RPi.GPIO

Just what I'd been wanting! So I got an LED and resistor and plugged them into a breadboard. I ran a black wire from the RPi's pin 6, ground, to the short LED pin, and connected the long pin via the resistor to the RPi's pin 18 (GPIO 24) (see the RPi Low-level peripherals for the official GPIO pin diagrams).

With the LED wired up, I plugged in my serial cable, powered up the RPi with its Raspbian SD card, and connected to it with screen /dev/ttyUSB0 115200. I configured the network to work on my local net and typed sudo apt-get install python-rpi.gpio to get the latest version. It got 0.5.2a-1. Hooray!

I hurried to do a test:

pi@raspberrypi:~$ sudo python
Python 2.7.3 (default, Jan 13 2013, 11:20:46) 
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> 
>>> import RPi.GPIO as GPIO
>>> GPIO.setmode(GPIO.BCM)
>>> GPIO.setup(24, GPIO.OUT)
>>> led = GPIO.PWM(24, 100)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'PWM'

Whoops! But Raspbian said it was the right version ... I checked again with aptitude show python-rpi.gpio -- yep, 0.5.2a-1. Hmph!

After some poking around, I discovered that help(GPIO), after printing out an interminable list of exception classes, eventually gets to this:

    VERSION = '0.5.1a'

In other words, Raspbian is fibbing: that package that Raspbian says is version 0.5.2a-1 is actually version 0.5.1a. (This is the sort of thing that makes Raspberry Pi such a joy to work with. Yes, that's sarcasm.)

Okay. Let's try removing that bogus Raspbian package and getting it from pypi instead:

apt-get remove python-rpi.gpio
pip install --upgrade RPi.GPIO

Then I tried the same test as before. Success! And now I was able to set the LED to half brightness:

led.start(50)

I was able to brighten and dim the LED at will:

led.ChangeDutyCycle(90)
led.ChangeDutyCycle(25)

I played with it a little while longer, then cleaned up:

led.stop()
GPIO.cleanup()

If you're experimenting with RPi.GPIO's PWM, you'll want to check out this useful 2-part tutorial:

What about motors?

So PWM works great for LEDs. But would it drive my little robotic car?

I unplugged my LED and wired up one of the SN754410 motor driver circuits I'd built for the Arduino. And it worked just as well! I was able to control the motor speed using ChangeDutyCycle().

I'll write that up separately, but I do have one caveat: GPIO.cleanup(), for some reason, sets the pin output to HIGH. So if you have your car plugged in and sitting on the ground when you run cleanup(), it will take off at full speed. I recommend testing with the car on a stand and the wheels off the ground.

Update: the motor post is up now, at Driving two DC motors with a Raspberry Pi.


Tags: , , ,
[ 21:00 May 04, 2013    More hardware | permalink to this entry | comments ]

Sat, 16 Mar 2013

SimpleCV on Raspberry Pi

I'm at PyCon, and I spent a lot of the afternoon in the Raspberry Pi lab.

Raspberry Pis are big at PyCon this year -- because everybody at the conference got a free RPi! To encourage everyone to play, they have a lab set up, well equipped with monitors, keyboards, power and ethernet cables, plus a collection of breadboards, wires, LEDs, switches and sensors.

I'm primarily interested in the RPi as a robotics controller, one powerful enough to run a camera and do some minimal image processing (which an Arduino can't do). And on Thursday, I attended a PyCon tutorial on the Python image processing library SimpleCV. It's a wrapper for OpenCV that makes it easy to access parts of images, do basic transforms like greyscale, monochrome, blur, flip and rotate, do edge and line detection, and even detect faces and other objects. Sounded like just the ticket, if I could get it to work on a Raspberry Pi.

SimpleCV can be a bit tricky to install on Mac and Windows, apparently. But the README on the SimpleCV git repository gives an easy 2-line install for Ubuntu. It doesn't run on Debian Squeeze (though it installs), because apparently it depends on a recent version of pygame and Squeeze's is too old; but Ubuntu Pangolin handled it just fine.

The question was, would it work on Raspbian Wheezy? Seemed like a perfect project to try out in the PyCon RPi lab. Once my RPi was set up and I'd run an apt-get update, I used netsurf (the most modern of the lightweight browsers available on the RPi) to browse to the SimpleCV installation instructions. The first line,

sudo apt-get install ipython python-opencv python-scipy python-numpy python-pygame python-setuptools python-pip
was no problem. All those packages are available in the Raspbian repositories.

But the second line,

sudo pip install https://github.com/ingenuitas/SimpleCV/zipball/master
failed miserably. Seems that pip likes to put its large downloaded files in /tmp; and on Raspbian, running off an SD card, /tmp quite reasonably is a tmpfs, running in RAM. But that means it's quite small, and programs that expect to be able to use it to store large files are doomed to failure.

I tried a couple of simple Linux patches, with no success. You can't rename /tmp to replace it with a symlink to a directory on the SD card, because /tmp is always in use. And pip makes a new temp directory name each time it's run, so you can't just symlink the pip location to a place on the SD card.

I thought about rebooting after editing the tmpfs out of /etc/fstab, but it turns out it's not set up there, and it wasn't obvious how to disable the tmpfs. Searching later from home, the size is set in /etc/default/tmpfs. As for disabling the tmpfs and using the SD card instead, it's not clear. There's a block of code in /etc/init.d/mountkernfs.sh that makes that decision; it looks like symlinking /tmp to somewhere else might do it, or else commenting out the code that sets RAMTMP="yes". But I haven't tested that.

Instead of rebooting, I downloaded the file to the SD card:

wget https://github.com/ingenuitas/SimpleCV/master

But it turned out it's not so easy to pip install from a local file. After much fussing around I came up with this, which worked:

pip install http:///home/pi/master --download-cache /home/pi/tmp

That worked, and the resulting SimpleCV install worked nicely! I typed some simple tests into the simplecv shell, playing around with their built-in test image "lenna":

img = Image('lenna')
img.show()
img.binarize().show()
img.toGray().show()
img.edges().show()
img.invert().show()

And, for something a little harder, some face feature detection: let's find her eyes and outline them in yellow.

img.listHaarFeatures()
img.findHaarFeatures('eye.xml').draw(color=Color.YELLOW)
[Lenna, edges] [Lenna, eyes detected]

SimpleCV is lots of fun! And the edge detection was quite fast on the RPi -- this may well be usable by a robot, once I get the motors going.

Tags: , , , ,
[ 21:43 Mar 16, 2013    More linux/install | permalink to this entry | comments ]

Sun, 06 Jan 2013

Rescuing a wrongly soldered box header connector

For a recent Raspberry Pi project, I decided to use the Adafruit Pi Cobbler to give me easy access to the RPi's GPIO pins.

My Cobbler arrived shortly before I had to leave for a short trip. I was planning to take my RPi with me -- but not my soldering iron. So the night before I left, I hastily soldered together the Cobbler along with a few other parts I wanted to play with. No problem -- it's an easy solder project, lots of pins but no delicate parts or complicated circuitry.

Later, far from home, I opened up my hardware hack box, set up a breadboard and started plugging in wires, following one of the tutorials mentioned below. Except -- wait, the pins didn't seem to be in the place I expected them. I quickly realized I'd soldered the ribbon cable connector on backward. Argh!

There's no way I could unsolder 26 pins all at once, even at home; but away from home, without even a soldering iron, how could I possibly recover?

[ribbon cable connector] (image courtesy of PANAMATIK of Wikipedia)

The ribbon cable connector is basically symmetrical, two rows of 13 pins. The connector on the cable is keyed -- it has a dingus sticking out of it that's supposed to fit into the slot in the connector's plastic box. If I could, say, cut another slot on the opposite side of the plastic box, something big enough for the ribbon cable's sticky-out dingus (sorry for the highly technical language!), I could salvage this project and play with my RPi.

I was just about to dig in with the diagonal cutter when someone on IRC suggested that I try to slide the plastic box (it turns out this is called a "box header") up off the pins, turn it around and slide it back on. They suggested that using a heat gun to soften the plastic might help.

I didn't have a heat gun where I was staying, but I did have a hairdryer. I slipped a jeweler's screwdriver under the bottom of one side of the box, levered against the circuit board to apply pressure upward, and hit it with the hairdryer. It slid a few millimeters immediately.

I switched to the other side of the box and repeated; that side obligingly slid too. About ten minutes of alternating sides and occasional blasts from the hairdryer, and the box was off! Sliding it back on was even easier. Project rescued!

(Incidentally, you may be thinking that the Cobbler is really just a way to connect the Pi's header pins to a breadboard. I could have used the backwards-soldered Cobbler and just kept track of which pins should map to which other pins. True! But all the pin numbers would have been mislabeled, and I know myself well enough to know that eventually, I would have gotten the pin mapping wrong and plugged something in to the wrong place. Having destroyed an Adafruit Wave Shield earlier that day by doing just that, connecting 5V to an I/O pin that it turned out wasn't expecting it (who knew the SD reader chip was so sensitive?), I didn't want to take the same risk with my only Raspberry Pi.)

[ 16:29 Jan 06, 2013    More hardware | permalink to this entry | comments ]

Fri, 09 Nov 2012

How to talk to your Raspberry Pi over an ethernet crossover cable with IP masquerading

I've been using my Raspberry Pi mostly headless -- I'm interested in using it to control hardware. Most of my experimenting is at home, where I can plug the Pi's built-in ethernet directly into the wired net.

But what about when I venture away from home, perhaps to a group hacking session, or to give a talk? There's no wired net at most of these places, and although you can buy USB wi-fi dongles, wi-fi is so notoriously flaky that I'd never want to rely on it, especially as my only way of talking to the Pi.

Once or twice I've carried a router along, so I could set up my own subnet -- but that means an extra device, ten times as big as the Pi, and needing its own power supply in a place where power plugs may be scarce.

The real solution is a crossover ethernet cable. (My understanding is that you can't use a normal ethernet cable between two computers; the data send and receive lines will end up crossed. Though I may be wrong about that -- one person on #raspberrypi reported using a normal ethernet cable without trouble.)

Buying a crossover cable at Fry's was entertaining. After several minutes of staring at the dozens of bins of regular ethernet cables, I finally found the one marked crossover, and grabbed it. Immediately, a Fry's employee who had apparently been lurking in the wings rushed over to warn me that this wasn't a normal cable, this wasn't what I wanted, it was a weird special cable. I thanked him and assured him that was exactly what I'd come to buy.

Once home, with my laptop connected to wi-fi, I plugged one end into the Pi and the other end into my laptop ... and now what? How do I configure the network so I can talk to the Pi from the laptop, and the Pi can gateway through the laptop to the internet?

The answer is IP masquerading. Originally I'd hoped to give the Pi a network address on the same network (192.168.1) as the laptop. When I use the Pi at home, it picks a network address on 192.168.1, and it would be nice not to have to change that when I travel elsewhere. But if that's possible, I couldn't find a way to do it.

Okay, plan B: the laptop is on 192.168.1 (or whatever network the wi-fi happens to assign), while the Pi is on a different network, 192.168.0. That was relatively easy, with some help from the Masquerading Simple Howto.

Once I got it working, I wrote a script, since there are quite a few lines to type and I knew I wouldn't remember them all. Of course, the script has to be run as root. Here's the script, on github: masq.
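In case that link ever goes stale: the core of a masquerading setup like this boils down to only a few lines. This is just a sketch, not the actual masq script, and it assumes the laptop's wi-fi is wlan0, the crossover cable is on eth0, and the Pi lives on 192.168.0.x:

# give the wired interface an address on the Pi's subnet
ifconfig eth0 192.168.0.1 netmask 255.255.255.0

# let the kernel forward packets between the two interfaces
echo 1 > /proc/sys/net/ipv4/ip_forward

# rewrite the Pi's traffic so it appears to come from the laptop's wi-fi address
iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE

# forward the Pi's outgoing packets, and let the replies back in
iptables -A FORWARD -i eth0 -o wlan0 -j ACCEPT
iptables -A FORWARD -i wlan0 -o eth0 -m state --state ESTABLISHED,RELATED -j ACCEPT

On the Pi's side, all that's needed is a static address on 192.168.0.x with 192.168.0.1 as its gateway.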

I had to change one thing from the howto: at the end, when it sets up security, this line is supposed to enable incoming connections on all interfaces except wlan0:

iptables -A INPUT -m state --state NEW -i ! wlan0 -j ACCEPT

But that gave me an error, Bad argument `wlan0'. What worked instead was

iptables -A INPUT -m state --state NEW ! -i wlan0 -j ACCEPT
Only a tiny change: swap the order of -i and !. (I sent a correction to the howto authors but haven't heard back yet.)

All set! It's a nice compact way to talk to your Pi anywhere. Of course, don't forget to label your crossover cable, so you don't accidentally try to use it as a regular ethernet cable. Now please excuse me while I go label mine.

Update: Ed Davies has a great followup, Crossover Cables and Red Tape, that talks about how to set up a subnet if you don't need the full masquerading setup, why non-crossover cables might sometimes work, and a good convention for labeling crossover cables: use red tape. I'm going to adopt that convention too -- thanks, Ed!

[ 16:57 Nov 09, 2012    More hardware | permalink to this entry | comments ]

Tue, 31 Jul 2012

Raspberry Pi quickstart: headless setup (no monitor)

Raspberry Pi, the tiny, cheap, low-power Linux computer, dropped their order restrictions a few weeks ago, and it finally became possible for anyone to order one. I immediately (well, a day later, since the two sites that sell them were slashdotted with people trying to order) put in an order with Newark/element14. They said they were backordered six weeks, but I wasn't in a hurry -- I just wanted to get in the queue.

Imagine my surprise when half a week later I got a notice that my Pi had shipped! I got it yesterday. Thanks, Element14!

The Pi comes with no OS preloaded -- it boots off the SD card. The Raspberry Pi site has a download page where you can get an image of Debian Wheezy (their recommendation), Arch, or several other Linux distros. I downloaded their latest Wheezy image and unzipped it.

But instructions on what to do from there are scanty, and tend to be heavy on "click on this, then drag to here" directives that make no sense if you're not using whatever desktop they assume you have. So here's what ended up working.

Writing the SD card with dd

First, make sure you downloaded the image correctly: run sha1sum 2012-07-15-wheezy-raspbian.zip and compare the sum it prints out with the one on the download page.

Then get an appropriate SD card. The image is sized for a 2G card, so that's what I used, but you can use a larger card if needed ... you'll only get 2G initially but you can resize the partition later.

Plug the SD card into a reader on your regular Linux desktop/laptop machine, and figure out which device it is: I used cat /proc/partitions.
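If it's not obvious which entry is the card, comparing the partition list before and after inserting it makes it clear:

cat /proc/partitions        # before inserting the card
# ... plug in the SD card, wait a couple of seconds ...
cat /proc/partitions        # whatever is new (e.g. sdb and sdb1) is the card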

Then, assuming the SD card is in /dev/sdb (make sure of this! you don't want to destroy your system disk by writing to the wrong place!)

dd bs=1M if=2012-07-15-wheezy-raspbian.img of=/dev/sdb
sync
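dd prints nothing while it works, by the way; if you get impatient, GNU dd will report its progress if you send it SIGUSR1 from another terminal:

# ask the running dd how far it has gotten (GNU dd prints I/O stats on SIGUSR1)
sudo pkill -USR1 -x dd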
Wait a while, make sure all the lights are off on your SD drive, then remove the SD card from the reader. (Yes, even if you're about to mount it to change something.)

Headless Raspberry Pi

Now you have an SD card that will probably boot your Pi. If you want to run X on it and see a desktop, you'll need a USB keyboard and mouse, some sort of monitor, and the appropriate cable. That stopped me. The Pi needs either an HDMI to DVI cable -- which I don't have, though I will buy one as soon as I get a chance -- or an RCA composite video cable. I think our old CRT TV can take composite video, but what I see on the net suggests this is a poor solution for the Pi since the resolution and image quality aren't really adequate.

But in any case, one of my main intended uses for the Pi involves using it headless, as a robotics controller, in connection with an Arduino or other hardware for analog device control. So the Pi needs to be able to boot without a monitor, taking commands via its built-in ethernet interface, probably using ssh. That means making some changes to the SD card.

Reinsert the card. (Why not just leave it in place? Because the image you just wrote changed the partition table, and your computer won't see the change unless you remove and reinsert the card.)

The card now has two partitions on it -- you can check that via /proc/partitions. The first is the /boot partition, where you shouldn't need to change anything. The second is the root filesystem. Mount the second partition if your system didn't do that automatically:

mount /dev/sdb2 /mnt
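A quick sanity check that you've mounted the root partition and not the boot partition:

ls /mnt        # should show a typical Linux root filesystem: bin, etc, home, var ...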

Now specify a static IP address, so you'll always know how to get to your Pi. Edit /mnt/etc/network/interfaces and change the iface eth0 inet dhcp line to something like this, using numbers that will work for your local network:

iface eth0 inet static
address 192.168.1.50
netmask 255.255.255.0
gateway 192.168.1.1

Now, if you google for other people who want to ssh in to their Raspberry Pis or run them headless, you will find approximately 1,532,776 pages telling you that to enable sshd you'll need to rename a file named boot_enable_ssh.rc somewhere on the /boot partition to boot.rc. Disregard this. There is no such file on the current official wheezy pi images, and you will go crazy looking for it.

Happily, it turns out that the current images have the ssh server enabled by default. You can verify that by looking at /mnt/etc/init.d/ssh and seeing that it starts sshd. So after setting a static IP, you're ready to umount /mnt.
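If you want one more check before unmounting (this assumes the usual Debian sysvinit runlevel layout), look for an ssh start link in the default runlevel:

# an S??ssh symlink here means sshd will be started at boot
ls /mnt/etc/rc2.d/ | grep ssh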

You're done! Remove the card, stick it in the Raspberry Pi, plug in an ethernet cable, then power it with a micro USB cable. Wait a minute or two (it's not the world's fastest booter), and you should be able to ssh to pi@192.168.1.50 or whatever address you gave it. Log in with the password specified on the Downloads page where you got the OS image ... and you're good to go.

Fun! Now I'm off to find an HDMI-DVI cable.

[ 21:26 Jul 31, 2012    More hardware | permalink to this entry | comments ]
